https://en.wikipedia.org/wiki?curid=4702
Brahui language
Brahui (also romanised as Brahvi or Brohi) is a Dravidian language, spoken by the Brahui people primarily in the central areas (Brahuistan) of the Pakistani province of Balochistan, with smaller communities of speakers scattered in parts of Iranian Baluchestan, Afghanistan, and Turkmenistan (around Merv). It is also spoken by expatriate Brahui communities in Iraq, Qatar, and the United Arab Emirates. It is isolated from the nearest Dravidian-speaking neighbouring population, in South India, by a considerable distance. The Kalat, Khuzdar, Mastung, Quetta, Bolan, Nasirabad, Nushki, and Kharan districts of Balochistan Province are predominantly Brahui-speaking. Brahui is the only Dravidian language that is primarily written in the Perso-Arabic script. It is also written in the Latin script. Distribution. Brahui is spoken in the central part of Pakistani Balochistan, mainly in the Kalat, Khuzdar and Mastung districts, but also in smaller numbers in neighbouring districts, as well as in Afghanistan, which borders Pakistani Balochistan; however, many members of the ethnic group no longer speak Brahui. There are also an unknown (but very small) number of expatriate Brahuis in the Arab states of the Persian Gulf and in Turkmenistan. History. There is no consensus as to whether Brahui is a relatively recent language introduced into Balochistan or the remnant of a formerly more widespread Dravidian language family. According to Josef Elfenbein (1989), the most common theory is that the Brahui were part of a Dravidian migration into the north-western parts of the Indian subcontinent in the 3rd millennium BC, but that, unlike other Dravidians who migrated to the south, they have remained in Sarawan and Jahlawan since before 2000 BC. Some other scholars, however, see it as a recent migrant language to its present region, postulating that Brahui could only have migrated to Balochistan from central India after 1000 AD. This is contradicted by genetic evidence that shows the Brahui population to be indistinguishable from neighbouring Balochi speakers and genetically distant from Central Dravidian speakers. The main Iranian contributor to Brahui vocabulary, Balochi, is a Northwestern Iranian language that moved to the area from the west only around 1000 AD. One scholar places the migration as late as the 13th or 14th century. The Brahui lexicon is believed to be 35% Perso-Arabic, 20% Balochi, 20% Indo-Aryan, and 15% Dravidian in origin, with the remaining 10% of unknown origin. Franklin Southworth proposed that Brahui is not a Dravidian language but can be linked with the remaining Dravidian languages and Elamite to form a "Zagrosian family", which originated in Southwest Asia (southern Iran) and was widely distributed in South Asia and parts of eastern West Asia before the Indo-Aryan migration. Dialects. There are no important dialectal differences. The Jhalawani (southern, centered on Khuzdar) and Sarawani (northern, centered on Kalat) dialects are distinguished by the pronunciation of *h, which is retained only in the north (Elfenbein 1997). Brahui has been influenced by the Iranian languages spoken in the area, including Persian, Balochi and Pashto. Phonology. Brahui vowels show a partial length distinction between long vowels and diphthongs on the one hand and short vowels on the other. Brahui lacks short /e, o/: under the influence of neighbouring Indo-Aryan and Iranic languages, the Proto-Dravidian short *e was replaced by a, ē and i, and *o by ō, u and a in root syllables.
Brahui consonants show patterns of retroflexion but lack the aspiration distinctions found in surrounding languages, and they include several fricatives, such as the voiceless lateral fricative [ɬ], a sound not otherwise found in the region. The consonants are also very similar to those of Balochi, but Brahui has more fricatives and nasals (Elfenbein 1993). Stress. Stress in Brahui follows a quantity-based pattern, occurring either on the first long vowel or diphthong, or on the first syllable if all vowels are short. Orthography. Perso-Arabic script. Brahui is the only Dravidian language which is not known to have been written in a Brahmi-based script; instead, it has been written in the Arabic script since the second half of the 20th century. Other Dravidian languages have also historically been written in the Arabic script by the Muslim minority speakers of the respective languages, namely in Arabi-Tamil and Arabi-Malayalam. In Pakistan, an Urdu-based Nastaʿlīq script is used in writing. Brahui orthography is unique in having the letter 'ڷ', representing the voiceless lateral fricative [ɬ]. The table below presents the letters adopted for Brahui orthography. Latin script. More recently, a Roman-based orthography named Brolikva (an abbreviation of "Brahui Roman Likvar") was developed by the Brahui Language Board of the University of Balochistan in Quetta and adopted by the newspaper Talár. In the newly promoted Bráhuí Báşágal Brolikva orthography, the letters with diacritics are the long vowels, the post-alveolar or retroflex consonants, and the voiced velar or voiceless alveolar fricatives. The native alphabetic order is: b á p í s y ş v (w) x e z ź ģ f ú m n l g c t ŧ r ŕ d o đ h j k a i u ń ļ. Sample text. English. All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood. Latin script. Muccá insáńk ájo o izzat ná rid aŧ barebar vadí massuno. Ofte puhí o dalíl raseńgáne. andáde ofte asi elo ton ílumí e vaddifoí e. Endangerment. According to a 2009 UNESCO report, Brahui is one of the 27 languages of Pakistan that are facing the danger of extinction. It was classified as "unsafe", the least endangered of the five levels of concern (Unsafe, Definitely Endangered, Severely Endangered, Critically Endangered and Extinct); this status has since been renamed "vulnerable". Publications. Talár is the first daily newspaper in the Brahui language. It uses the new Roman orthography and is "an attempt to standardize and develop [the] Brahui language to meet the requirements of modern political, social and scientific discourse."
https://en.wikipedia.org/wiki?curid=4706
Berkeley DB
Berkeley DB (BDB) is an embedded database software library for key/value data, historically significant in open-source software. Berkeley DB is written in C with API bindings for many other programming languages. BDB stores arbitrary key/data pairs as byte arrays and supports multiple data items for a single key. Berkeley DB is not a relational database, although it has database features including database transactions, multiversion concurrency control and write-ahead logging. BDB runs on a wide variety of operating systems, including most Unix-like and Windows systems, as well as real-time operating systems. BDB was commercially supported and developed by Sleepycat Software from 1996 to 2006. Sleepycat Software was acquired by Oracle Corporation in February 2006, which continued to develop and sell the C Berkeley DB library. In 2013 Oracle re-licensed BDB under the AGPL license and released new versions until May 2020. Bloomberg L.P. continues to develop a fork of the 2013 version of BDB within their Comdb2 database, under the original Sleepycat permissive license. Origin. Berkeley DB originated at the University of California, Berkeley as part of BSD, Berkeley's version of the Unix operating system. After 4.3BSD (1986), the BSD developers attempted to remove or replace all code originating in the original AT&T Unix from which BSD was derived. In doing so, they needed to rewrite the Unix database package. Seltzer and Yigit created a new database, unencumbered by any AT&T patents: an on-disk hash table that outperformed the existing dbm libraries. Berkeley DB itself was first released in 1991 and later included with 4.4BSD. In 1996 Netscape requested that the authors of Berkeley DB improve and extend the library, then at version 1.86, to suit Netscape's requirements for an LDAP server and for use in the Netscape browser. That request led to the creation of Sleepycat Software. This company was acquired by Oracle Corporation in February 2006. Berkeley DB 1.x releases focused on managing key/value data storage and are referred to as "Data Store" (DS). The 2.x releases added a locking system enabling concurrent access to data; this is what is known as "Concurrent Data Store" (CDS). The 3.x releases added a logging system for transactions and recovery, called "Transactional Data Store" (TDS). The 4.x releases added the ability to replicate log records and create a distributed, highly available, single-master multi-replica database; this is called the "High Availability" (HA) feature set. Berkeley DB's evolution has sometimes led to minor API changes or log format changes, but very rarely have database formats changed. Berkeley DB HA supports online upgrades from one version to the next by maintaining the ability to read and apply the prior release's log records. Starting with the 6.0.21 (Oracle 12c) release, all Berkeley DB products are licensed under the GNU AGPL. Previously, Berkeley DB was redistributed under the 4-clause BSD license (before version 2.0) and the Sleepycat Public License, which is an OSI-approved open-source license as well as an FSF-approved free software license. The product ships with complete source code, build script, test suite, and documentation. This comprehensive feature set, along with the licensing terms, has led to its use in a multitude of free and open-source software.
Those who do not wish to abide by the terms of the GNU AGPL, or who use an older version with the Sleepycat Public License, have the option of purchasing another proprietary license for redistribution from Oracle Corporation. This technique is called dual licensing. Berkeley DB includes compatibility interfaces for some historic Unix database libraries: dbm, ndbm and hsearch (a System V and POSIX library for creating in-memory hash tables). Architecture. Berkeley DB has an architecture notably simpler than relational database management systems. Like SQLite and LMDB, it is not based on a server/client model and does not provide support for network access; programs access the database using in-process API calls. Oracle added support for SQL in the 11g R2 release, based on the popular SQLite API, by including a version of SQLite in Berkeley DB (with Berkeley DB used for storage). A program accessing the database is free to decide how the data is to be stored in a record; Berkeley DB puts no constraints on the record's data. The record and its key can both be up to four gigabytes long. Berkeley DB supports database features such as ACID transactions, fine-grained locking, hot backups and replication. Oracle Corporation's use of the name "Berkeley DB". The name "Berkeley DB" is used by Oracle Corporation for three different products, only one of which is BDB. Open-source programs still using Berkeley DB. BDB was once very widespread, but usage dropped steeply from 2013 (see the licensing section). Notable software that still uses Berkeley DB for data storage includes a number of open-source projects. Open-source operating systems, and languages such as Perl and Python, still support old BerkeleyDB interfaces. The FreeBSD and OpenBSD operating systems ship Berkeley DB 1.8x to support the dbopen() database library call used by password programs such as pwd_mkdb. Linux operating systems, including those based on Debian as well as Fedora, ship Berkeley DB 5.3 libraries. Licensing. Berkeley DB V2.0 and higher is available under a dual license. Switching the open source license in 2013 from the Sleepycat license to the AGPL had a major effect on open source software. Since BDB is a library, any application linking to it must be under an AGPL-compatible license. Many open source applications and all closed source applications would need to be relicensed to become AGPL-compatible, which was not acceptable to many developers and open source operating systems. By 2013 there were many alternatives to BDB, and Debian was typical in its decision to phase out Berkeley DB completely, with a preference for the Lightning Memory-Mapped Database (LMDB).
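As a rough illustration of the key/value model and in-process access described above, the following Python sketch uses the third-party bsddb3 bindings. This is a minimal sketch under stated assumptions: the bsddb3 package name and availability vary by system and Python version, and the filename and data shown are purely illustrative, not part of any Berkeley DB documentation.

```python
# Minimal key/value session with Berkeley DB via the bsddb3 bindings
# (assumes `pip install bsddb3` and an installed BDB C library).
from bsddb3 import db

database = db.DB()
# Create an on-disk hash table -- the access method descended from the
# original Seltzer/Yigit work -- if the file does not already exist.
database.open("example.db", dbtype=db.DB_HASH, flags=db.DB_CREATE)

# Keys and values are arbitrary byte strings; BDB imposes no schema.
database.put(b"first release", b"1991")
database.put(b"written in", b"C")

print(database.get(b"first release"))  # b'1991'
database.close()
```

Because access is through in-process API calls rather than a network protocol, there is no server to configure; the library reads and writes the file directly within the calling process.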
https://en.wikipedia.org/wiki?curid=4715
Boolean satisfiability problem
In logic and computer science, the Boolean satisfiability problem (sometimes called the propositional satisfiability problem and abbreviated SATISFIABILITY, SAT or B-SAT) asks whether there exists an interpretation that satisfies a given Boolean formula. In other words, it asks whether the formula's variables can be consistently replaced by the values TRUE or FALSE to make the formula evaluate to TRUE. If this is the case, the formula is called "satisfiable", else "unsatisfiable". For example, the formula ""a" AND NOT "b"" is satisfiable because one can find the values "a" = TRUE and "b" = FALSE, which make ("a" AND NOT "b") = TRUE. In contrast, ""a" AND NOT "a"" is unsatisfiable. SAT is the first problem that was proven to be NP-complete—this is the Cook–Levin theorem. This means that all problems in the complexity class NP, which includes a wide range of natural decision and optimization problems, are at most as difficult to solve as SAT. There is no known algorithm that efficiently solves each SAT problem (where "efficiently" means "deterministically in polynomial time"). Although such an algorithm is generally believed not to exist, this belief has not been proven or disproven mathematically. Resolving the question of whether SAT has a polynomial-time algorithm would settle the P versus NP problem, one of the most important open problems in the theory of computing. Nevertheless, as of 2007, heuristic SAT-algorithms are able to solve problem instances involving tens of thousands of variables and formulas consisting of millions of symbols, which is sufficient for many practical SAT problems from, e.g., artificial intelligence, circuit design, and automatic theorem proving. Definitions. A "propositional logic formula", also called a "Boolean expression", is built from variables, the operators AND (conjunction, also denoted by ∧), OR (disjunction, ∨), NOT (negation, ¬), and parentheses. A formula is said to be "satisfiable" if it can be made TRUE by assigning appropriate logical values (i.e. TRUE, FALSE) to its variables. The "Boolean satisfiability problem" (SAT) is, given a formula, to check whether it is satisfiable. This decision problem is of central importance in many areas of computer science, including theoretical computer science, complexity theory, algorithmics, cryptography and artificial intelligence. Conjunctive normal form. A "literal" is either a variable (in which case it is called a "positive literal") or the negation of a variable (a "negative literal"). A "clause" is a disjunction of literals (or a single literal). A clause is called a "Horn clause" if it contains at most one positive literal. A formula is in "conjunctive normal form" (CNF) if it is a conjunction of clauses (or a single clause). For example, "x"1 is a positive literal, ¬"x"2 is a negative literal, and "x"1 ∨ ¬"x"2 is a clause. The formula ("x"1 ∨ ¬"x"2) ∧ (¬"x"1 ∨ "x"2 ∨ "x"3) ∧ ¬"x"1 is in conjunctive normal form; its first and third clauses are Horn clauses, but its second clause is not. The formula is satisfiable, by choosing "x"1 = FALSE, "x"2 = FALSE, and "x"3 arbitrarily, since (FALSE ∨ ¬FALSE) ∧ (¬FALSE ∨ FALSE ∨ "x"3) ∧ ¬FALSE evaluates to (FALSE ∨ TRUE) ∧ (TRUE ∨ FALSE ∨ "x"3) ∧ TRUE, and in turn to TRUE ∧ TRUE ∧ TRUE (i.e. to TRUE). In contrast, the CNF formula "a" ∧ ¬"a", consisting of two clauses of one literal each, is unsatisfiable, since for "a"=TRUE or "a"=FALSE it evaluates to TRUE ∧ ¬TRUE (i.e., FALSE) or FALSE ∧ ¬FALSE (i.e., again FALSE), respectively.
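To make the definitions concrete, here is a minimal brute-force checker for CNF formulas, applied to the two examples above. It is only an illustration of the problem statement, not how practical SAT solvers work; the signed-integer clause encoding and the function name are illustrative choices, not standard API.

```python
from itertools import product

def satisfiable(formula, num_vars):
    """Brute-force SAT check: try all 2^n assignments.

    `formula` is a CNF given as a list of clauses, each clause a list
    of integers: +i means variable x_i, -i means its negation.
    Returns a satisfying assignment (tuple of booleans) or None.
    """
    for assignment in product([False, True], repeat=num_vars):
        # A clause is satisfied if at least one of its literals is true;
        # the formula is satisfied if every clause is.
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in formula):
            return assignment
    return None

# (x1 v ~x2) & (~x1 v x2 v x3) & ~x1 -- the example from the text.
print(satisfiable([[1, -2], [-1, 2, 3], [-1]], 3))  # (False, False, False)

# a & ~a, the unsatisfiable example.
print(satisfiable([[1], [-1]], 1))                  # None
```

The exhaustive loop makes the exponential nature of the naive approach visible: doubling the number of variables squares the number of assignments tried.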
For some versions of the SAT problem, it is useful to define the notion of a "generalized conjunctive normal form" formula, viz. as a conjunction of arbitrarily many "generalized clauses", the latter being of the form "R"("l"1, ..., "l""n") for some Boolean function "R" and (ordinary) literals "l""i". Different sets of allowed Boolean functions lead to different problem versions. As an example, "R"(¬"x","a","b") is a generalized clause, and "R"(¬"x","a","b") ∧ "R"("b","y","c") ∧ "R"("c","d",¬"z") is a generalized conjunctive normal form. This formula is used below, with "R" being the ternary operator that is TRUE just when exactly one of its arguments is. Using the laws of Boolean algebra, every propositional logic formula can be transformed into an equivalent conjunctive normal form, which may, however, be exponentially longer. For example, transforming the formula ("x"1∧"y"1) ∨ ("x"2∧"y"2) ∨ ... ∨ ("x""n"∧"y""n") into conjunctive normal form yields the conjunction of all clauses ("z"1 ∨ "z"2 ∨ ... ∨ "z""n") in which each "z""i" is either "x""i" or "y""i"; while the former is a disjunction of "n" conjunctions of 2 variables, the latter consists of 2^"n" clauses of "n" variables. However, with use of the Tseytin transformation, we may find an equisatisfiable conjunctive normal form formula with length linear in the size of the original propositional logic formula. Complexity. SAT was the first problem known to be NP-complete, as proved by Stephen Cook at the University of Toronto in 1971 and independently by Leonid Levin at the Russian Academy of Sciences in 1973. Until that time, the concept of an NP-complete problem did not even exist. The proof shows how every decision problem in the complexity class NP can be reduced to the SAT problem for CNF formulas, sometimes called CNFSAT. A useful property of Cook's reduction is that it preserves the number of accepting answers. For example, deciding whether a given graph has a 3-coloring is another problem in NP; if a graph has 17 valid 3-colorings, then the SAT formula produced by the Cook–Levin reduction will have 17 satisfying assignments. NP-completeness only refers to the run-time of the worst case instances. Many of the instances that occur in practical applications can be solved much more quickly. See §Algorithms for solving SAT below. 3-satisfiability. Like the satisfiability problem for arbitrary formulas, determining the satisfiability of a formula in conjunctive normal form where each clause is limited to at most three literals is NP-complete as well; this problem is called 3-SAT, 3CNFSAT, or 3-satisfiability. To reduce the unrestricted SAT problem to 3-SAT, transform each clause "l"1 ∨ ⋯ ∨ "l""n" to a conjunction of "n" − 2 clauses ("l"1 ∨ "l"2 ∨ "x"2) ∧ (¬"x"2 ∨ "l"3 ∨ "x"3) ∧ ⋯ ∧ (¬"x""n"−3 ∨ "l""n"−2 ∨ "x""n"−2) ∧ (¬"x""n"−2 ∨ "l""n"−1 ∨ "l""n"), where "x"2, ..., "x""n"−2 are fresh variables not occurring elsewhere (a code sketch of this transformation appears below). Although the two formulas are not logically equivalent, they are equisatisfiable. The formula resulting from transforming all clauses is at most 3 times as long as its original; that is, the length growth is polynomial. 3-SAT is one of Karp's 21 NP-complete problems, and it is used as a starting point for proving that other problems are also NP-hard. This is done by polynomial-time reduction from 3-SAT to the other problem. An example of a problem where this method has been used is the clique problem: given a CNF formula consisting of "c" clauses, the corresponding graph consists of a vertex for each literal and an edge between each two non-contradicting literals from different clauses. The graph has a "c"-clique if and only if the formula is satisfiable.
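The clause-splitting step of the SAT-to-3-SAT reduction described above can be sketched as follows, reusing the signed-integer clause encoding from the earlier example; the function name and fresh-variable numbering scheme are illustrative.

```python
def split_clause(clause, next_var):
    """Split one clause (l1 v ... v ln) with n > 3 literals into an
    equisatisfiable chain of n-2 three-literal clauses, using fresh
    variables numbered from `next_var` upward.
    Returns (new_clauses, next_unused_var).
    """
    n = len(clause)
    if n <= 3:
        return [clause], next_var
    # (l1 v l2 v x2), then linking clauses (~x_i v l_{i+1} v x_{i+1}),
    # and finally (~x_{n-2} v l_{n-1} v l_n), as in the text.
    out = [[clause[0], clause[1], next_var]]
    for lit in clause[2:-2]:
        out.append([-next_var, lit, next_var + 1])
        next_var += 1
    out.append([-next_var, clause[-2], clause[-1]])
    return out, next_var + 1

# Splitting (l1 v l2 v l3 v l4 v l5) with fresh variables from 6:
clauses, _ = split_clause([1, 2, 3, 4, 5], next_var=6)
print(clauses)  # [[1, 2, 6], [-6, 3, 7], [-7, 4, 5]]
```

The chain is satisfiable exactly when some original literal is true: the fresh variables can always be set to "carry" satisfaction along the chain, but cannot satisfy it on their own.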
There is a simple randomized algorithm due to Schöning (1999) that runs in time (4/3)^"n", where "n" is the number of variables in the 3-SAT proposition, and succeeds with high probability in correctly deciding 3-SAT. The exponential time hypothesis asserts that no algorithm can solve 3-SAT (or indeed "k"-SAT for any "k" ≥ 3) in time 2^"o"("n") (that is, fundamentally faster than exponential in "n"). Selman, Mitchell, and Levesque (1996) give empirical data on the difficulty of randomly generated 3-SAT formulas, depending on their size parameters. Difficulty is measured in the number of recursive calls made by a DPLL algorithm. They identified a phase transition region from almost-certainly-satisfiable to almost-certainly-unsatisfiable formulas at a clauses-to-variables ratio of about 4.26. 3-satisfiability can be generalized to k-satisfiability (k-SAT, also k-CNF-SAT), in which formulas in CNF are considered with each clause containing up to "k" literals. However, for any "k" ≥ 3, this problem can be neither easier than 3-SAT nor harder than SAT, and since the latter two are NP-complete, so is k-SAT. Some authors restrict k-SAT to CNF formulas with exactly k literals per clause. This does not lead to a different complexity class either, as each clause with "j" < "k" literals can be padded with fixed dummy variables to "l"1 ∨ ⋯ ∨ "l""j" ∨ "d"1 ∨ ⋯ ∨ "d""k"−"j". After padding all clauses, 2^"k"−1 extra clauses over the dummy variables must be appended to ensure that only "d"1 = ⋯ = "d""k" = FALSE can lead to a satisfying assignment. Since "k" does not depend on the formula length, the extra clauses lead to a constant increase in length. For the same reason, it does not matter whether duplicate literals are allowed in clauses, as in "x" ∨ "x" ∨ "y". Special cases of SAT. Conjunctive normal form. Conjunctive normal form (in particular with 3 literals per clause) is often considered the canonical representation for SAT formulas. As shown above, the general SAT problem reduces to 3-SAT, the problem of determining satisfiability for formulas in this form. Disjunctive normal form. SAT is trivial if the formulas are restricted to those in disjunctive normal form, that is, a disjunction of conjunctions of literals. Such a formula is indeed satisfiable if and only if at least one of its conjunctions is satisfiable, and a conjunction is satisfiable if and only if it does not contain both "x" and NOT "x" for some variable "x". This can be checked in linear time. Furthermore, if they are restricted to being in full disjunctive normal form, in which every variable appears exactly once in every conjunction, they can be checked in constant time (each conjunction represents one satisfying assignment). But it can take exponential time and space to convert a general SAT problem to disjunctive normal form; to obtain an example, exchange "∧" and "∨" in the above exponential blow-up example for conjunctive normal forms. Exactly-1 3-satisfiability. Another NP-complete variant of the 3-satisfiability problem is one-in-three 3-SAT (also known variously as 1-in-3-SAT and exactly-1 3-SAT). Given a conjunctive normal form with three literals per clause, the problem is to determine whether there exists a truth assignment to the variables so that each clause has "exactly" one TRUE literal (and thus exactly two FALSE literals). Not-all-equal 3-satisfiability. Another variant is the not-all-equal 3-satisfiability problem (also called NAE3SAT). Given a conjunctive normal form with three literals per clause, the problem is to determine whether an assignment to the variables exists such that in no clause all three literals have the same truth value.
This problem is NP-complete too, even if no negation symbols are admitted, by Schaefer's dichotomy theorem. Linear SAT. A 3-SAT formula is "Linear SAT" ("LSAT") if each clause (viewed as a set of literals) intersects at most one other clause, and, moreover, if two clauses intersect, then they have exactly one literal in common. An LSAT formula can be depicted as a set of disjoint semi-closed intervals on a line. Deciding whether an LSAT formula is satisfiable is NP-complete. 2-satisfiability. SAT is easier if the number of literals in a clause is limited to at most 2, in which case the problem is called 2-SAT. This problem can be solved in polynomial time, and in fact is complete for the complexity class NL. If additionally all OR operations in clauses are changed to XOR operations, the result is called exclusive-or 2-satisfiability, a problem complete for the complexity class SL = L. Horn-satisfiability. The problem of deciding the satisfiability of a given conjunction of Horn clauses is called Horn-satisfiability, or HORN-SAT. It can be solved in polynomial time by a single step of the unit propagation algorithm, which produces the single minimal model of the set of Horn clauses (w.r.t. the set of literals assigned to TRUE); a code sketch of this procedure follows at the end of this subsection. Horn-satisfiability is P-complete and can be seen as P's version of the Boolean satisfiability problem. Also, deciding the truth of quantified Horn formulas can be done in polynomial time. Horn clauses are of interest because they are able to express the implication of one variable by a set of other variables. Indeed, one such clause ¬"x"1 ∨ ... ∨ ¬"x""n" ∨ "y" can be rewritten as "x"1 ∧ ... ∧ "x""n" → "y"; that is, if "x"1, ..., "x""n" are all TRUE, then "y" must be TRUE as well. A generalization of the class of Horn formulas is that of renameable-Horn formulae, the set of formulas that can be placed in Horn form by replacing some variables with their respective negations. For example, ("x"1 ∨ ¬"x"2) ∧ (¬"x"1 ∨ "x"2 ∨ "x"3) ∧ ¬"x"1 is not a Horn formula, but can be renamed to the Horn formula ("x"1 ∨ ¬"x"2) ∧ (¬"x"1 ∨ "x"2 ∨ ¬"y"3) ∧ ¬"x"1 by introducing "y"3 as the negation of "x"3. In contrast, no renaming of ("x"1 ∨ ¬"x"2 ∨ ¬"x"3) ∧ (¬"x"1 ∨ "x"2 ∨ "x"3) ∧ ¬"x"1 leads to a Horn formula. Checking the existence of such a replacement can be done in linear time; therefore, the satisfiability of such formulae is in P, as it can be solved by first performing this replacement and then checking the satisfiability of the resulting Horn formula. XOR-satisfiability. Another special case is the class of problems where each clause contains XOR (i.e. exclusive or) rather than (plain) OR operators. This is in P, since an XOR-SAT formula can be viewed as a system of linear equations mod 2 and solved in cubic time by Gaussian elimination. Schaefer's dichotomy theorem. The restrictions above (CNF, 2CNF, 3CNF, Horn, XOR-SAT) bound the considered formulae to be conjunctions of subformulas; each restriction states a specific form for all subformulas: for example, only binary clauses can be subformulas in 2CNF. Schaefer's dichotomy theorem states that, for any restriction to Boolean functions that can be used to form these subformulas, the corresponding satisfiability problem is in P or NP-complete. The membership in P of the satisfiability of 2CNF, Horn, and XOR-SAT formulae is a special case of this theorem. The following table summarizes some common variants of SAT.
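Here is the promised sketch of unit propagation for HORN-SAT, in the same signed-integer encoding as the earlier examples. It is a simple fixpoint version for illustration, not the optimized linear-time algorithm; all names are illustrative.

```python
def horn_sat(clauses):
    """Decide HORN-SAT by unit propagation, returning the minimal model
    (the set of variables forced TRUE) or None if unsatisfiable.
    Each clause must be a Horn clause: at most one positive literal.
    """
    true_vars = set()
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            # Drop negative literals already falsified (their variable
            # was forced TRUE); whatever remains could still be satisfied.
            alive = [l for l in clause if not (l < 0 and -l in true_vars)]
            if not alive:
                return None  # every literal falsified: unsatisfiable
            # A clause reduced to a single positive literal forces it TRUE.
            if len(alive) == 1 and alive[0] > 0 and alive[0] not in true_vars:
                true_vars.add(alive[0])
                changed = True
    return true_vars  # variables not in the set stay FALSE

# (x1 AND x2 -> x3), x1, x2  encoded as (-1 v -2 v 3), (1), (2):
print(horn_sat([[-1, -2, 3], [1], [2]]))  # {1, 2, 3}
print(horn_sat([[1], [-1]]))              # None
```

Starting from the all-FALSE assignment and only ever flipping variables to TRUE when forced is exactly why the result is the unique minimal model mentioned above.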
Extensions of SAT. An extension that has gained significant popularity since 2003 is satisfiability modulo theories (SMT), which can enrich CNF formulas with linear constraints, arrays, all-different constraints, uninterpreted functions, etc. Such extensions typically remain NP-complete, but very efficient solvers are now available that can handle many such kinds of constraints. The satisfiability problem becomes more difficult if both "for all" (∀) and "there exists" (∃) quantifiers are allowed to bind the Boolean variables. An example of such an expression would be ∀"x" ∀"y" ∃"z" ("x" ∨ "y" ∨ "z") ∧ (¬"x" ∨ ¬"y" ∨ ¬"z"); it is valid, since for all values of "x" and "y", an appropriate value of "z" can be found, viz. "z"=TRUE if both "x" and "y" are FALSE, and "z"=FALSE else. SAT itself (tacitly) uses only ∃ quantifiers. If only ∀ quantifiers are allowed instead, the so-called tautology problem is obtained, which is co-NP-complete. If any number of both quantifiers are allowed, the problem is called the quantified Boolean formula problem (QBF), which can be shown to be PSPACE-complete. It is widely believed that PSPACE-complete problems are strictly harder than any problem in NP, although this has not yet been proved. Using highly parallel "P systems", QBF-SAT problems can be solved in linear time. Ordinary SAT asks if there is at least one variable assignment that makes the formula true; a variety of variants deal with the number of such assignments. Other generalizations include satisfiability for first- and second-order logic, constraint satisfaction problems, and 0-1 integer programming. Finding a satisfying assignment. While SAT is a decision problem, the search problem of finding a satisfying assignment reduces to SAT. That is, any algorithm which correctly answers whether an instance of SAT is solvable can be used to find a satisfying assignment. First, the question is asked on the given formula Φ. If the answer is "no", the formula is unsatisfiable. Otherwise, the question is asked on the partly instantiated formula Φ{"x"1=TRUE}, that is, Φ with the first variable "x"1 replaced by TRUE and simplified accordingly. If the answer is "yes", then "x"1=TRUE; otherwise, "x"1=FALSE. Values of other variables can be found subsequently in the same way. In total, "n"+1 runs of the algorithm are required, where "n" is the number of distinct variables in Φ. This property is used in several theorems in complexity theory; a code sketch of the self-reduction appears at the end of this article. Algorithms for solving SAT. Since the SAT problem is NP-complete, only algorithms with exponential worst-case complexity are known for it. In spite of this, efficient and scalable algorithms for SAT were developed during the 2000s, and they have contributed to dramatic advances in the ability to automatically solve problem instances involving tens of thousands of variables and millions of constraints (i.e. clauses). Examples of such problems in electronic design automation (EDA) include formal equivalence checking, model checking, formal verification of pipelined microprocessors, automatic test pattern generation, routing of FPGAs, planning and scheduling problems, and so on. A SAT-solving engine is also considered to be an essential component in the electronic design automation toolbox. Major techniques used by modern SAT solvers include the Davis–Putnam–Logemann–Loveland algorithm (or DPLL), conflict-driven clause learning (CDCL), and stochastic local search algorithms such as WalkSAT. Almost all SAT solvers include time-outs, so they will terminate in reasonable time even if they cannot find a solution.
Different SAT solvers will find different instances easy or hard, and some excel at proving unsatisfiability while others are better at finding solutions. Recent attempts have been made to learn an instance's satisfiability using deep learning techniques. SAT solvers are developed and compared in SAT-solving contests. Modern SAT solvers are also having a significant impact on the fields of software verification, constraint solving in artificial intelligence, and operations research, among others.
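Here is the promised sketch of the decision-to-search self-reduction from the "Finding a satisfying assignment" section: it turns any yes/no SAT procedure into an assignment finder using "n"+1 oracle calls. The brute-force satisfiable() checker from the earlier sketch serves as a stand-in oracle; all names and the oracle choice are illustrative.

```python
def simplify(formula, lit):
    """Assign literal `lit` TRUE and simplify the CNF: clauses containing
    `lit` are satisfied and dropped; the opposite literal is removed."""
    out = []
    for clause in formula:
        if lit in clause:
            continue
        out.append([l for l in clause if l != -lit])  # [] = falsified
    return out

def find_assignment(formula, num_vars, sat_oracle):
    """Self-reduction: n+1 calls to a yes/no oracle yield an assignment.

    `sat_oracle(formula, num_vars)` must return True iff satisfiable.
    """
    if not sat_oracle(formula, num_vars):
        return None  # call 1: the formula is unsatisfiable
    assignment = {}
    for var in range(1, num_vars + 1):
        fixed = simplify(formula, var)       # tentatively var = TRUE
        if sat_oracle(fixed, num_vars):      # one call per variable
            assignment[var] = True
            formula = fixed
        else:
            # The current formula is satisfiable, so var must be FALSE;
            # no extra oracle call is needed for this branch.
            assignment[var] = False
            formula = simplify(formula, -var)
    return assignment

# Stand-in oracle: the brute-force satisfiable() sketched earlier.
oracle = lambda f, n: satisfiable(f, n) is not None
print(find_assignment([[1, -2], [-1, 2, 3], [-1]], 3, oracle))
# {1: False, 2: False, 3: True}
```

With a polynomial-time oracle this whole procedure would run in polynomial time, which is exactly the property the text says is used in several complexity-theoretic theorems.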
https://en.wikipedia.org/wiki?curid=4717
Bob Jones University
Bob Jones University (BJU) is a private university in Greenville, South Carolina, United States. It is known for its conservative and evangelical cultural and religious positions. The university, with approximately 2,900 students, is accredited by the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) and the Transnational Association of Christian Colleges and Schools. In 2017, the university estimated the number of its graduates at 40,184. History. During the Fundamentalist-Modernist controversy of the 1920s, Christian evangelist Bob Jones Sr. grew increasingly concerned about what he perceived to be the secularization of higher education and the influence of religious liberalism in denominational colleges. Jones recalled that in 1924, his friend William Jennings Bryan leaned over to him at a Bible conference service in Winona Lake, Indiana, and said, "If schools and colleges do not quit teaching evolution as a fact, we are going to become a nation of atheists." Though Jones was not a college graduate, he was determined to found a college. On September 12, 1927, Jones opened Bob Jones College in Lynn Haven, Florida, with 88 students. Jones said that although he had been averse to naming the school after himself, his friends overcame his reluctance "with the argument that the school would be called by that name because of my connection with it, and to attempt to give it any other name would confuse the people". Bob Jones took no salary from the college. He supported the school with personal savings and income from his evangelistic campaigns. The Florida land boom had peaked in 1925, and a hurricane in September 1926 further reduced land values. Bob Jones College barely survived bankruptcy and its move to Cleveland, Tennessee, in 1933. In the same year, the college also ended participation in intercollegiate sports. Bankrupt at the nadir of the Depression, without a home and with barely enough money to move its library and office furniture, the college became the largest liberal arts college in Tennessee thirteen years later. With the enactment of the GI Bill at the end of World War II, the need for campus expansion to accommodate increased enrollment led to a relocation to South Carolina. Though Jones had served as acting president as early as 1934, his son, Bob Jones Jr. became the school's second president in 1947 before the college moved to Greenville, South Carolina, and became Bob Jones University. In Greenville, the university more than doubled in size within two years and started an AM radio station in 1949 (1260 WMUU with 94.5 WMUU-FM signing on in 1960), film department, and art gallery—the latter of which eventually became one of the largest collections of religious art in the Western Hemisphere. During the late 1950s, BJU and alumnus Billy Graham, who had attended Bob Jones College for one semester in 1936 and received an honorary degree from the university in 1948, had a dispute over the propriety of theological conservatives cooperating with theological liberals to support evangelistic campaigns, a controversy that widened an already growing rift between separatist fundamentalists and other evangelicals. Negative publicity caused by the dispute precipitated a decline in BJU enrollment of about 10% in the years 1956–59, and seven members of the university board (of about a hundred) also resigned in support of Graham, including Graham himself and two of his staff members. 
When, in 1966, Graham held his only American campaign in Greenville, the university forbade BJU dormitory students to attend under penalty of expulsion. Enrollment quickly rebounded, and by 1970, there were 3,300 students, approximately 60% more than in 1958. In 1971, Bob Jones III became president at age 32, though his father, with the title of Chancellor, continued to exercise considerable administrative authority into the late 1990s. At the 2005 commencement, Stephen Jones was installed as the fourth president, and Bob Jones III assumed the title of chancellor. Stephen Jones resigned in 2014 for health reasons, and evangelist Steve Pettit was named president, the first president unrelated to the Jones family. In 2011, the university became a member of the Transnational Association of Christian Colleges and Schools (TRACS) and reinstated intercollegiate athletics. In March 2017, the university regained its federal tax exemption after a complicated restructuring divided the organization into for-profit and non-profit entities, and in June 2017, it was granted accreditation by the Southern Association of Colleges and Schools. In March 2023, Pettit resigned, effective May 5, citing his inability to work with the chairman of the university's board of trustees. Shortly thereafter, the president of the board also resigned. Vice President Alan Benson was appointed interim president for the 2023–24 school year. In May 2024, Baptist pastor and BJU alumnus Joshua Crockett was elected the university's sixth president. After he returned to his former pastorate in 2025, the Board of Trustees named Bruce McAllister, Vice President for Ministry, as the seventh president. Academics. The university comprises seven colleges and schools offering more than 60 undergraduate majors, including fourteen associate degree programs. Many of the university employees consider their positions as much ministries as jobs. It is common for retiring professors to have served the university for more than forty years, a circumstance that has contributed to the stability and conservatism of an institution that has virtually no endowment and at which faculty salaries are "sacrificial". Religious education. School of Religion. The School of Religion includes majors for both men and women, although only men train as ministerial students. In 1995, 1,290 BJU graduates were serving as senior or associate pastors in churches across the United States. In 2017 more than 100 pastors in the Upstate (South Carolina) alone were BJU graduates. Fine arts. The Division of Fine Arts has the largest faculty of the university's six undergraduate schools. Each year, the university presents an opera in the spring semester and Shakespearean plays in both the fall and spring semesters. The Division of Fine Arts includes an RTV department with a campus radio and television station, WBJU. More than a hundred concerts, recitals, and laboratory theater productions are also presented annually. Each fall, as a recruiting tool, the university sponsors a "High School Festival" in which students compete in music, art, and speech (including preaching) contests with their peers from around the country. In the spring, a similar competition sponsored by the American Association of Christian Schools, and hosted by BJU since 1977, brings thousands of national finalists to the university from around the country. In 2005, 120 of the finalists from previous years returned to BJU as freshmen. Science. 
Bob Jones University supports young-earth creationism: all of its biology faculty are young-earth creationists, and the university rejects evolution, calling it "at best an unsupportable and unworkable hypothesis". Accreditation and rankings. Bob Jones Sr. was leery of academic accreditation almost from the founding of the college, and by the early 1930s, he had publicly stated his opposition to holding regional accreditation. Jones and the college were criticized for this stance, and academic recognition, as well as student and faculty recruitment, were hindered. In 1944, Jones wrote to John Walvoord of Dallas Theological Seminary that while the university had "no objection to educational work highly standardized…. We, however, cannot conscientiously let some group of educational experts or some committee of experts who may have a behavioristic or atheistic slant on education control or even influence the administrative policies of our college." Five years later, Jones reflected that "it cost us something to stay out of an association, but we stayed out. We have lived up to our convictions." Because graduates did not benefit from accredited degrees, the faculty felt an increased responsibility to prepare their students. Early in the history of the college, there had been some hesitancy on the part of other institutions to accept BJU credits at face value, but by the 1960s, BJU alumni were being accepted by most of the major graduate and professional schools in the United States. In 2004, the university began the process of joining the Transnational Association of Christian Colleges and Schools. Candidate status—effectively, accreditation—was obtained in April 2005, and full membership in the Association was conferred in November 2006. In December 2011, BJU announced its intention to apply for regional accreditation with the Southern Association of Colleges and Schools (SACSCOC), and it received that accreditation in 2017. In 2025, "US News" ranked BJU as #17 (tie) in Regional Universities South and #4 in Best Value Schools. Political involvement. As a twelve-year-old, Bob Jones Sr. made a twenty-minute speech in defense of the Populist Party. Jones was a friend and admirer of William Jennings Bryan but also campaigned throughout the South for Herbert Hoover (and against Al Smith) during the 1928 presidential election. The authorized history of BJU notes that both Bob Jones Sr. and Bob Jones Jr. "played political hardball" when dealing with the three municipalities in which the school was successively located. For instance, in 1962, Bob Jones Sr. warned the Greenville City Council that he had "four hundred votes in his pocket and in any election he would have control over who would be elected." Bob Jones Sr.'s April 17, 1960, Easter Sunday sermon, broadcast on the radio and entitled "Is Segregation Scriptural?", served as the university's position paper on race in the 1960s, '70s, and '80s. The transcript was sent in pamphlet form in fund-raising letters and sold in the university bookstore. In the sermon, Jones states, "If you are against segregation and against racial separation, then you are against God Almighty." The school began a long history of supporting politicians who were considered aligned with racial segregation. Republican Party ties. From nearly the inception of Bob Jones College, a majority of students and faculty were from the northern United States, where there was a larger ratio of Republicans to Democrats than in the South (which was solidly Democratic).
Therefore, almost from its founding year, BJU had a larger portion of Republicans than the surrounding community. After South Carolina Senator Strom Thurmond switched his allegiance to the Republican Party in 1964, BJU faculty members became increasingly influential in the new state Republican party. BJU alumni were elected to local political and party offices. In 1976, candidates supported by BJU faculty and alumni captured the local Republican party with unfortunate short-term political consequences, but by 1980 the religious right and the "country club" Republicans had joined forces. From then on, most Republican candidates for local and statewide offices sought the endorsement of Bob Jones III and greeted faculty/staff voters at the University Dining Common. National Republicans soon followed. Ronald Reagan spoke at the school in 1980, although the Joneses supported his opponent, John Connally, in the South Carolina primary. Later, Bob Jones III denounced Reagan as "a traitor to God's people" for choosing George H. W. Bush—whom Jones called a "devil"—as his vice president. Even later, Jones III shook Bush's hand and thanked him for being a good president. In the 1990s, other Republicans such as Dan Quayle, Pat Buchanan, Phil Gramm, Bob Dole, and Alan Keyes also spoke at BJU. Democrats were rarely invited to speak at the university, in part because they took political and social positions (especially support for abortion rights) opposed by the Religious Right. 2000 election. On February 2, 2000, then Texas Governor George W. Bush, as a candidate for president, spoke during school's chapel hour. His political opponents quickly noted his non-mention of the university's ban on interracial dating. During the Michigan primary, Bush was also criticized for not stating his opposition to the university's anti-Catholicism. The McCain campaign targeted Catholics with "Catholic Voter Alert" phone calls, reminding voters of Bush's visit to BJU. New York Republican Representative Peter King, who was supporting John McCain in the presidential primary, called Bush a tool of "anti-Catholic bigoted forces", after the visit. King described BJU as "an institution that is notorious in Ireland for awarding an honorary doctorate to Northern Ireland's tempestuous Protestant leader, Ian Paisley." Bush denied that he either knew of or approved what he regarded as BJU's intolerant policies. On February 26, Bush issued a formal letter of apology to Cardinal John Joseph O'Connor of New York for failing to denounce Bob Jones University's history of anti-Catholic statements. Bush said at a news conference following the letter's release, "I make no excuses. I had an opportunity and I missed it. I regret that...I wish I had gotten up then and seized the moment to set a tone, a tone that I had set in Texas, a positive and inclusive tone." Also during the 2000 Republican primary campaign in South Carolina, Richard Hand, a BJU professor, spread a false e-mail rumor that John McCain had fathered an illegitimate child. The McCains have an adopted daughter from Bangladesh, and later push polling also implied that the child was biracial. Withdrawal from politics. Although the March 2007 issue of "Foreign Policy" listed BJU as one of "The World's Most Controversial Religious Sites" because of its past influence on American politics, BJU has seen little political controversy since Stephen Jones became president. 
When asked by a "Newsweek" reporter if he wished to play a political role, Stephen Jones replied, "It would not be my choice." Further, when asked if he felt ideologically closer to his father's engagement with politics or to other evangelicals who have tried to avoid civic involvement, Jones answered, "The gospel is for individuals. The main message we have is to individuals. We're not here to save the culture." In a 2005 "Washington Post" interview, Jones dodged political questions and even admitted that he was embarrassed by "some of the more vitriolic comments" made by his predecessors. "I don't want to get specific," Jones said, "But there were things said back then that I wouldn't say today." In October 2007, when Bob Jones III, as "a private citizen," endorsed Mitt Romney for the Republican nomination for president, Stephen Jones made it clear that he wished "to stay out of politics" and that neither he nor the university had endorsed anyone. Despite a hotly contested South Carolina primary, none of the candidates appeared on the platform of BJU's Founders' Memorial Amphitorium during the 2008 election cycle. In April 2008, Stephen Jones told a reporter, "I don't think I have a political bone in my body." Renewed political engagement. In 2015 BJU reemerged as a campaign stop for conservative Republicans. Ben Carson and Ted Cruz held large on-campus rallies on two successive days in November. BJU president Steve Pettit met with Marco Rubio, Rick Perry, Mike Huckabee, and Scott Walker. Jeb Bush, Carson, Cruz, and Rubio also appeared at a 2016 Republican presidential forum at BJU. Chip Felkel, a Greenville Republican consultant, noted that some candidates closely identified "with the folks at Bob Jones. So it makes sense for them to want to be there." Nevertheless, unlike BJU's earlier periods of political involvement, Pettit did not endorse a candidate. According to Furman University political science professor Jim Guth, because Greenville has grown so much recently, it is unlikely BJU will ever again have the same political influence it had between the 1960s and the 1980s. Nevertheless, about a quarter of all BJU graduates continue to live in the Upstate, and as long-time mayor Knox White has said, "The alumni have had a big impact on every profession and walk of life in Greenville." Campus. The university occupies 205 acres at the eastern city limit of Greenville. The institution moved into its initial 25 buildings during the 1947–48 school year, and later buildings were also faced with the light yellow brick chosen for the originals. Museum and gallery. Bob Jones Jr. was a connoisseur of European art from his teen years and began collecting after World War II on about $30,000 a year authorized by the University Board of Directors. Jones first concentrated on the Italian Baroque, a style then out of favor and relatively inexpensive in the years immediately following the war. The museum's collection currently includes more than 400 European paintings from the 14th through the 19th centuries, period furniture, and a notable collection of Russian icons. The museum also includes a variety of Holy Land antiquities. The gallery is strong in Baroque paintings and includes notable works by Rubens, Tintoretto, Veronese, Cranach, Gerard David, Murillo, Mattia Preti, Ribera, van Dyck, and Gustave Doré. 
Included in the Museum & Gallery collection are seven large canvases, part of a series by Benjamin West painted for George III, called "The Progress of Revealed Religion", which are displayed in the War Memorial Chapel. The museum also includes a variety of Holy Land antiquities collected in the early 20th century by missionaries Frank and Barbara Bowen. Every Easter, the university and the Museum & Gallery present the "Living Gallery", a series of tableaux vivants recreating noted works of religious art using live models disguised as part of two-dimensional paintings. BJU has been criticized by some fundamentalists for promoting "false Catholic doctrine" through its art gallery because much of Baroque art was created for the Counter-Reformation. A painting by Lucas van Leyden that had been displayed in the gallery's collection for more than ten years and had been consigned to Sotheby's for sale was recognized by Interpol as art that had been stolen by the Nazis from the Mittelrhein-Museum in Koblenz. The painting was eventually returned to Germany after months of negotiations between the Mittelrhein-Museum and Julius H. Weitzner, a dealer in Old Master paintings. After the death of Bob Jones Jr., Erin Jones, the wife of BJU president Stephen Jones, became director. According to David Steel, curator of European art at the North Carolina Museum of Art, Erin Jones "brought that museum into the modern era", employing "a top-notch curator, John Nolan", and following "best practices in conservation and restoration". The museum cooperates with other institutions, lending works for outside shows such as a Rembrandt exhibit in 2011. In 2008, the BJU Museum & Gallery opened a satellite location, the Museum & Gallery at Heritage Green near downtown Greenville, which featured rotating exhibitions from the main museum and interactive children's activities. In February 2017, the Museum & Gallery closed both locations permanently. In 2018, the museum announced that a new home would be built at a yet-undetermined location off the BJU campus. In 2021, Erin Jones said the museum was exploring a permanent home near the proposed downtown conference center. Library. The Mack Library (named for John Sephus Mack) holds a collection of more than 300,000 books and includes seating for 1,200 as well as a computer lab and a computer classroom. (Its ancillary, a music library, is included in the Gustafson Fine Arts Center.) Mack Library's Special Collections includes an American Hymnody Collection of about 700 titles. The "Jerusalem Chamber" is a replica of the room in Westminster Abbey in which work on the King James Version of the Bible was conducted, and it displays a collection of rare Bibles. An adjoining Memorabilia Room commemorates the life of Bob Jones Sr. and the history of the university. The library's Fundamentalism File collects periodical articles and ephemera about social and religious matters of interest to evangelicals and fundamentalists. The University Archives holds copies of all university publications, oral histories of faculty and staff members, surviving remnants of university correspondence, and pictures and artifacts related to the Jones family and the history of the university. Ancillary ministries. "Unusual Films". Both Bob Jones Sr. and Bob Jones Jr. believed that film could be an excellent medium for mass evangelism, and in 1950, the university established "Unusual Films" within the School of Fine Arts.
(The studio name derives from a former BJU promotional slogan, "The World's Most Unusual University".) Bob Jones Jr. selected a speech teacher, Katherine Stenholm, as the first director. Although she had no experience in cinema, she took summer courses at the University of Southern California and received personal instruction from Hollywood specialists, such as Rudolph Sternad. Unusual Films has produced seven feature-length films, each with an evangelistic emphasis: "Wine of Morning", "Red Runs the River", "Flame in the Wind", "Sheffey", "Beyond the Night", "The Printing", and "Milltown Pride". "Wine of Morning" (1955), based on a novel by Bob Jones Jr., represented the United States at the Cannes Film Festival. The first four films are historical dramas set, respectively, in the time of Christ, the U.S. Civil War, 16th-century Spain, and the late 19th-century South—the latter a fictionalized treatment of the life of Methodist evangelist, Robert Sayers Sheffey. "Beyond the Night" closely follows an actual 20th-century missionary saga in Central Africa, and "The Printing" uses composite characters to portray the persecution of believers in the former Soviet Union. According to The Dove Foundation, "The Printing" "no doubt will urge Christian believers everywhere to appreciate the freedoms they enjoy. It is inspiring!" In 1999, Unusual Films began producing feature films for children, including "The Treasure Map", "Project Dinosaur", and "Appalachian Trial". BJU Press. BJU Press originated from the need for textbooks for the burgeoning Christian school movement. The press publishes a full range of K–12 textbooks. BJU Press also offers distance learning courses online, via DVD and hard drive. Another ancillary, the Academy of Home Education, is a "service organization for homeschooling families" that maintains student records, administers achievement testing, and issues high school diplomas. The press sold its music division, SoundForth, to Lorenz Publishing on October 1, 2012. Pre-college programs. The university operates Bob Jones Academy, which enrolls students from preschool through 12th grade. With about 1100 students, the school's demographic makeup leans heavily white (90.3%), with non-Black minorities making up the bulk of other ethnicities. Black students make up 0.5% of enrollment. Controversies. Sexual abuse reports. In December 2011, in response to accusations of mishandling of student reports of sexual abuse (most of which had occurred in their home churches when the students were minors) and a concurrent reporting issue at a church pastored by a university board member, the BJU board of trustees hired an independent ombudsman, GRACE (Godly Response to Abuse in the Christian Environment), to investigate. Released in December 2014, the GRACE report suggested that BJU had discouraged students from reporting past sexual abuse, and though the university declined to implement many of the report's recommendations, President Steve Pettit formally apologized "to those who felt they did not receive from us genuine love, compassion, understanding, and support after suffering sexual abuse or assault". The university's mishandling of sexual abuse in the past came into light again in August 2020 when a student filed a lawsuit against Bob Jones University and Furman University alleging both administrations ignored the sexual assault report and expelled the student for consuming alcohol, which is against the Student Code of Conduct handbook. Racial policies and ban on interracial dating. 
Although BJU had admitted Asian students and other ethnic groups from its inception, it did not enroll Black students until 1971. From 1971 to 1975, BJU admitted only married Black people. However, the Internal Revenue Service (IRS) had already determined in 1970 that "private schools with racially discriminatory admissions policies" were not entitled to federal tax exemption. In 1975, the University Board of Trustees authorized a policy change to admit Black students, a move that occurred shortly before the announcement of the Supreme Court decision in "Runyon v. McCrary" (427 U.S. 160 [1976]), which prohibited racial exclusion in private schools. In May 1975, BJU expanded rules against interracial dating and marriage. In 1976, the Internal Revenue Service revoked the university's tax exemption retroactively to December 1, 1970, because it practiced racial discrimination. The case was heard by the U.S. Supreme Court in 1982. After BJU lost the decision in "Bob Jones University v. United States" (461 U.S. 574 [1983]), the university chose to maintain its interracial dating policy and pay a million dollars in back taxes. The year following the Court decision, contributions to the university declined by 13 percent. In 2000, following a media uproar prompted by the visit of presidential candidate George W. Bush to the university, Bob Jones III dropped the university's interracial dating rule, announcing the change on CNN's "Larry King Live". In the same year, Bob Jones III drew criticism after reposting a letter on the university's web page referring to Mormons and Catholics as being members of "cults which call themselves Christian". In 2005, Stephen Jones, great-grandson of the founder, became BJU's president. Bob Jones III then took the title Chancellor. In 2008, the university declared itself "profoundly sorry" for having allowed "institutional policies to remain in place that were racially hurtful". That year, BJU said it had enrolled students from fifty states and nearly fifty countries, claimed that these represented diverse ethnicities and cultures, and declared its administration "committed to maintaining on the campus the racial and cultural diversity and harmony characteristic of the true Church of Jesus Christ throughout the world". In his first meeting with the university cabinet in 2014, the fifth president Steve Pettit said it was appropriate for BJU to regain its tax-exempt status because BJU no longer held its earlier positions about race. "The Bible is clear," said Pettit, "We are made of one blood." By February 17, 2017, the IRS website had listed the university as a 501(c)(3) organization, and by May 2017, BJU had forged a working relationship with Greenville's Phillis Wheatley Center. In 2017, 9% of the student body was "from the American minority population". Student life. Religious atmosphere. Religion is a major aspect of life and curriculum at BJU. The BJU Creed, written in 1927 by journalist and prohibitionist Sam Small, is recited by students and faculty four days a week at chapel services. The university also encourages church planting in areas of the United States "in great need of fundamental churches", and it has provided financial and logistical assistance to ministerial graduates in starting more than a hundred new churches. Bob Jones III has also encouraged non-ministerial students to put their career plans on hold for two or three years to provide lay leadership for small churches.
Students of various majors participate in Missions Advance (formerly Mission Prayer Band), an organization that prays for missionaries and attempts to stimulate campus interest in world evangelism. During summers and Christmas breaks, about 150 students participate in teams that promote Christian missions around the world. Although a separate nonprofit corporation, Gospel Fellowship Association, an organization founded by Bob Jones Sr. and associated with BJU, is one of the largest fundamentalist mission boards in the country. Through its "Timothy Fund", the university also sponsors international students who are training for the ministry. The university requires the use of the King James Version (KJV) of the Bible in its services and classrooms, but it does not hold that the KJV is the only acceptable English translation or that it has the same authority as the original Hebrew and Greek manuscripts. The university's position has been criticized by some other fundamentalists, including fellow conservative university Pensacola Christian College, which in 1998 produced a widely distributed videotape arguing that this "defiling leaven in fundamentalism" was passed from the 19th-century Princeton theologian Benjamin B. Warfield through Charles Brokenshire to current BJU faculty members and graduates. Rules of conduct. Strict rules govern student life at BJU. The 2015–16 Student Handbook states, "Students are to avoid any types of entertainment that could be considered immodest or that contain profanity, scatological realism, sexual perversion, erotic realism, lurid violence, occultism and false philosophical or religious assumptions." Grounds for immediate dismissal include stealing, immorality (including sex between unmarried students), possession of hard-core pornography, use of alcohol or drugs, and participation in a public demonstration for a cause the university opposes. Similar "moral failures" are grounds for terminating the employment of faculty and staff. In 1998, a homosexual alumnus was threatened with arrest if he visited the campus. Men are allowed to wear polo shirts or dress shirts on weekdays until 5:00 pm. Effective in 2018, women are no longer required to wear skirts or dresses and may now wear pants. All students are required to attend chapel three days a week, as well as at least two services per week at an approved "local fundamental church". Other rules are not based on a specific biblical passage. For instance, the Handbook notes that "there is no specific Bible command that says, 'Thou shalt not be late to class', but a student who wishes to display orderliness and concern for others will not come in late to the distraction of the teacher and other students." In 2008, a campus spokesperson said that one goal of the dress code was "to teach our young people to dress professionally" on campus while giving them "the ability to...choose within the biblically accepted options of dress" when they were off campus. Additional rules require residence hall students to abide by a campus curfew of 11:00 pm on class days and 12:00 am on weekends. Students are requested not to go to movie theaters while in residence; however, they may watch movies rated G or PG while in the residence halls. Students are also requested not to listen to popular contemporary music. Male students and graduate students may have facial hair that is neatly trimmed and well maintained at approximately ½ inch or less. 
Women are expected to dress modestly and to wear business-casual clothing to class and religious services. Extracurriculars. After BJU abandoned intercollegiate sports in 1933, its intramural sports program included competition in soccer, basketball, softball, volleyball, tennis, badminton, flag football, table tennis, racquetball, and water polo. The university also competed in intercollegiate debate within the National Educational Debate Association, in intercollegiate mock trial and computer science competitions, and participated in the South Carolina Student Legislature. In 2012, BJU joined Division I of the National Christian College Athletic Association (NCCAA), and in 2014 it participated in intercollegiate soccer, basketball, cross-country, and golf. The teams are known as the Bruins. The university requires all unmarried incoming first-year students under 23 to join one of 33 "societies". Societies meet most Fridays for entertainment and fellowship and hold weekly prayer meetings. Societies compete with one another in intramural sports, debate, and Scholastic Bowl. The university also has a student-staffed newspaper ("The Collegian") and yearbook ("Vintage"). Early in December, thousands of students, faculty, and visitors gather around the front campus fountain for an annual Christmas carol singing and lighting ceremony, illuminating tens of thousands of Christmas lights. On December 3, 2004, the ceremony broke the Guinness World Record for Christmas caroling with 7,514 carolers. Before 2015, the university required students and faculty to attend a six-day Bible Conference instead of a traditional Spring Break. However, the university announced that beginning in 2016, it would hold the Bible Conference in February and give students a week of Spring Break in March. The Conference typically attracts fundamentalist preachers and laypeople from around the country, and some BJU class reunions are held during the week. Athletics. The Bob Jones University (BJU) athletic teams are called the Bruins. The university is a member of the National Christian College Athletic Association (NCCAA), primarily competing in the South Region at the Division II level. The Bruins previously competed in NCAA Division III, primarily as a D-III independent, from 2020–21 to 2022–23. BJU competes in 11 intercollegiate varsity sports. Men's sports include basketball, baseball, cross-country, golf, soccer, and track & field, while women's sports include basketball, cross-country, soccer, track & field, and volleyball. History. In 2012, the university inaugurated intercollegiate athletics with four teams: men's soccer, men's basketball, women's soccer, and women's basketball. The university added intercollegiate golf and cross-country teams during the 2013–2014 school year. Men's and women's shooting sports were added in 2016. Men's baseball began in the spring of 2021, and women's beach volleyball started in the spring of 2022. Director of athletics Neal Ring resigned in 2023; he had overseen Bruins Athletics since its inception. Through its first 11 seasons, the athletic department amassed 22 NCCAA National Championships, nearly 100 All-Americans, and over 200 Scholar-Athletes. Bruins Athletics also received six straight Presidential Awards for Excellence, honoring the most successful NCCAA Division II athletics program. Move to NCAA Division III. In 2018, BJU explored National Collegiate Athletic Association (NCAA) membership and applied for it in January 2020. 
The Bruins were accepted as Division III provisional members that June for a three-year period, making BJU the only Division III school in the state. The school has been searching for a conference affiliation. Notable people. Alumni. A number of BJU graduates have become influential within evangelical Christianity, including Ken Hay (founder of "The Wilds" Christian camps), Ron "Patch" Hamilton (composer and president of Majesty Music), Billy Kim (former president of the Baptist World Alliance), and Moisés Silva (president of the Evangelical Theological Society). BJU alumni also include the third pastor (1968–1976) of Riverside Church (Ernest T. Campbell), the former president of Northland Baptist Bible College (Les Ollila), the late president of Baptist Bible College (Ernest Pickering), and the former president of Clearwater Christian College (Richard Stratton). BJU alumnus Asa Hutchinson served as the governor of Arkansas and also served in the U.S. Congress; his brother Tim Hutchinson served in the U.S. Senate. Others have served in state government: Michigan state senator Alan Cropsey, Pennsylvania state representative Gordon Denlinger, Pennsylvania state representative Mark M. Gillen, former Speaker Pro Tempore of the South Carolina House of Representatives Terry Haskins, member of the South Carolina House of Representatives Wendy Nanney, Pennsylvania state representative Sam Rohrer, member of the Missouri House of Representatives Ryan Silvey, Maryland state senator Bryan Simonaire and his daughter, state delegate Meagan Simonaire, and South Carolina state senator Danny Verdin.
British Empire
The British Empire comprised the dominions, colonies, protectorates, mandates, and other territories ruled or administered by the United Kingdom and its predecessor states. It began with the overseas possessions and trading posts established by England in the late 16th and early 17th centuries, and colonisation attempts by Scotland during the 17th century. At its height in the 19th and early 20th centuries, it became the largest empire in history and, for a century, was the foremost global power. By 1913, the British Empire held sway over 412 million people, 23 per cent of the world population at the time, and by 1920, it covered 35,500,000 km² (13,700,000 sq mi), 24 per cent of the Earth's total land area. As a result, its constitutional, legal, linguistic, and cultural legacy is widespread. At the peak of its power, it was described as "the empire on which the sun never sets", as the sun was always shining on at least one of its territories. During the Age of Discovery in the 15th and 16th centuries, Portugal and Spain pioneered European exploration of the globe, and in the process established large overseas empires. Envious of the great wealth these empires generated, England, France, and the Netherlands began to establish colonies and trade networks of their own in the Americas and Asia. A series of wars in the 17th and 18th centuries with the Netherlands and France left Britain the dominant colonial power in North America. Britain became a major power in the Indian subcontinent after the East India Company's conquest of Mughal Bengal at the Battle of Plassey in 1757. The American War of Independence resulted in Britain losing some of its oldest and most populous colonies in North America by 1783. While retaining control of British North America (now Canada) and territories in and near the Caribbean in the British West Indies, British colonial expansion turned towards Asia, Africa, and the Pacific. After the defeat of France in the Napoleonic Wars (1803–1815), Britain emerged as the principal naval and imperial power of the 19th century and expanded its imperial holdings. It pursued trade concessions in China and Japan, and territory in Southeast Asia. The Great Game and the Scramble for Africa also ensued. The period of relative peace (1815–1914) during which the British Empire became the global hegemon was later described as the "Pax Britannica" (Latin for "British Peace"). Alongside the formal control that Britain exerted over its colonies, its dominance of much of world trade, and of its oceans, meant that it effectively controlled the economies of, and readily enforced its interests in, many regions, such as Asia and Latin America. It also came to dominate the Middle East. Increasing degrees of autonomy were granted to its white settler colonies, some of which were formally reclassified as Dominions by the 1920s. By the start of the 20th century, Germany and the United States had begun to challenge Britain's economic lead. Military, economic and colonial tensions between Britain and Germany were major causes of the First World War, during which Britain relied heavily on its empire. The conflict placed enormous strain on its military, financial, and manpower resources. Although the empire achieved its largest territorial extent immediately after the First World War, Britain was no longer the world's preeminent industrial or military power. In the Second World War, Britain's colonies in East Asia and Southeast Asia were occupied by the Empire of Japan. 
Despite the final victory of Britain and its allies, the damage to British prestige and the British economy helped accelerate the decline of the empire. India, Britain's most valuable and populous possession, achieved independence in 1947 as part of a larger decolonisation movement, in which Britain granted independence to most territories of the empire. The Suez Crisis of 1956 confirmed Britain's decline as a global power, and the handover of Hong Kong to China on 1 July 1997 symbolised for many the end of the British Empire, though fourteen overseas territories that are remnants of the empire remain under British sovereignty. After independence, many former British colonies, along with most of the dominions, joined the Commonwealth of Nations, a free association of independent states. Fifteen of these, including the United Kingdom, retain the same person as monarch, currently King Charles III. Origins (1497–1583). The foundations of the British Empire were laid when England and Scotland were separate kingdoms. In 1496, King Henry VII of England, following the successes of Spain and Portugal in overseas exploration, commissioned John Cabot to lead an expedition to discover a northwest passage to Asia via the North Atlantic. Cabot sailed in 1497, five years after the first voyage of Christopher Columbus, and made landfall on the coast of Newfoundland. He believed he had reached Asia, and there was no attempt to found a colony. Cabot led another voyage to the Americas the following year but did not return; it is unknown what happened to his ships. No further attempts to establish English colonies in the Americas were made until well into the reign of Queen Elizabeth I, during the last decades of the 16th century. In the meantime, Henry VIII's 1533 Statute in Restraint of Appeals had declared "that this realm of England is an Empire". The Protestant Reformation turned England and Catholic Spain into implacable enemies. In 1562, Elizabeth I encouraged the privateers John Hawkins and Francis Drake to engage in slave-raiding attacks against Spanish and Portuguese ships off the coast of West Africa with the aim of establishing an Atlantic slave trade. This effort was rebuffed; later, as the Anglo-Spanish Wars intensified, Elizabeth I gave her blessing to further privateering raids against Spanish ports in the Americas and against shipping that was returning across the Atlantic, laden with treasure from the New World. At the same time, influential writers such as Richard Hakluyt and John Dee (who was the first to use the term "British Empire") were beginning to press for the establishment of England's own empire. By this time, Spain had become the dominant power in the Americas and was exploring the Pacific Ocean, Portugal had established trading posts and forts from the coasts of Africa and Brazil to China, and France had begun to settle the Saint Lawrence River area, later to become New France. Although England tended to trail behind Portugal, Spain, and France in establishing overseas colonies, it carried out its first modern colonisation, referred to as the Munster Plantations, in 16th-century Ireland, settling the province with English and Welsh Protestants. England had already colonised part of the country following the Norman invasion of Ireland in 1169. Several people who helped establish the Munster plantations later played a part in the early colonisation of North America, particularly a group known as the West Country Men. English overseas possessions (1583–1707). 
In 1578, Elizabeth I granted a patent to Humphrey Gilbert for discovery and overseas exploration. That year, Gilbert sailed for the Caribbean with the intention of engaging in piracy and establishing a colony in North America, but the expedition was aborted before it had crossed the Atlantic. In 1583, he embarked on a second attempt. On this occasion, he formally claimed the harbour of the island of Newfoundland, although no settlers were left behind. Gilbert did not survive the return journey to England and was succeeded by his half-brother, Walter Raleigh, who was granted his own patent by Elizabeth in 1584. Later that year, Raleigh founded the Roanoke Colony on the coast of present-day North Carolina, but lack of supplies caused the colony to fail. In 1603, James VI of Scotland ascended (as James I) to the English throne, and in 1604 he negotiated the Treaty of London, ending hostilities with Spain. Now that England was at peace with its main rival, English attention shifted from preying on other nations' colonial infrastructure to the business of establishing its own overseas colonies. The British Empire began to take shape during the early 17th century, with the English settlement of North America and the smaller islands of the Caribbean, and the establishment of joint-stock companies, most notably the East India Company, to administer colonies and overseas trade. This period, until the loss of the Thirteen Colonies after the American War of Independence towards the end of the 18th century, has been referred to by some historians as the "First British Empire". Americas, Africa and the slave trade. England's early efforts at colonisation in the Americas met with mixed success. An attempt to establish a colony in Guiana in 1604 lasted only two years and failed in its main objective of finding gold deposits. Colonies on the Caribbean islands of St Lucia (1605) and Grenada (1609) rapidly folded. The first permanent English settlement in the Americas was founded in 1607 at Jamestown, led by Captain John Smith and managed by the Virginia Company; the Crown took direct control of the venture in 1624, thereby founding the Colony of Virginia. Bermuda was settled and claimed by England as a result of the 1609 shipwreck of the Virginia Company's flagship, while attempts to settle Newfoundland were largely unsuccessful. In 1620, Plymouth was founded as a haven by Puritan religious separatists, later known as the Pilgrims. Fleeing from religious persecution would become the motive for many English would-be colonists to risk the arduous trans-Atlantic voyage: Maryland was established by English Roman Catholics (1634), Rhode Island (1636) as a colony tolerant of all religions, and Connecticut (1639) for Congregationalists. England's North American holdings were further expanded by the annexation of the Dutch colony of New Netherland in 1664, following the capture of New Amsterdam, which was renamed New York. Although less financially successful than the colonies in the Caribbean, these territories had large areas of good agricultural land and attracted far greater numbers of English emigrants, who preferred their temperate climates. The British West Indies initially provided England's most important and lucrative colonies. Settlements were successfully established in St. Kitts (1624), Barbados (1627) and Nevis (1628), but they struggled until the "Sugar Revolution" transformed the Caribbean economy in the mid-17th century. 
Large sugarcane plantations were first established in the 1640s on Barbados, with assistance from Dutch merchants and Sephardic Jews fleeing Portuguese Brazil. At first, sugar was grown primarily using white indentured labour, but rising costs soon led English traders to embrace the use of imported African slaves. The enormous wealth generated by slave-produced sugar made Barbados the most successful colony in the Americas, and one of the most densely populated places in the world. This boom led to the spread of sugar cultivation across the Caribbean, financed the development of non-plantation colonies in North America, and accelerated the growth of the Atlantic slave trade, particularly the triangular trade of slaves, sugar and provisions between Africa, the West Indies and Europe. To ensure that the increasingly healthy profits of colonial trade remained in English hands, Parliament decreed in 1651 that only English ships would be able to ply their trade in English colonies. This led to hostilities with the United Dutch Provinces, a series of Anglo-Dutch Wars that would eventually strengthen England's position in the Americas at the expense of the Dutch. In 1655, England annexed the island of Jamaica from the Spanish, and in 1666 succeeded in colonising the Bahamas. In 1670, Charles II incorporated by royal charter the Hudson's Bay Company (HBC), granting it a monopoly on the fur trade in the area known as Rupert's Land, which would later form a large proportion of the Dominion of Canada. Forts and trading posts established by the HBC were frequently the subject of attacks by the French, who had established their own fur-trading colony in adjacent New France. Two years later, the Royal African Company was granted a monopoly on the supply of slaves to the British colonies in the Caribbean. The company would transport more slaves across the Atlantic than any other, and it significantly grew England's share of the trade, from 33 per cent in 1673 to 74 per cent in 1683. The removal of this monopoly between 1688 and 1712 allowed independent British slave traders to thrive, leading to a rapid escalation in the number of slaves transported. British ships carried a third of all slaves shipped across the Atlantic, approximately 3.5 million Africans, until the abolition of the trade by Parliament in 1807 (see "Abolition of slavery" below). To facilitate the shipment of slaves, forts were established on the coast of West Africa, such as James Island, Accra and Bunce Island. In the British Caribbean, the percentage of the population of African descent rose from 25 per cent in 1650 to around 80 per cent in 1780, and in the Thirteen Colonies from 10 per cent to 40 per cent over the same period (the majority in the southern colonies). The transatlantic slave trade played a pervasive role in British economic life and became a major economic mainstay for western port cities. Ships registered in Bristol, Liverpool and London were responsible for the bulk of British slave trading. For the transported, harsh and unhygienic conditions on the slaving ships and poor diets meant that the average mortality rate during the Middle Passage was one in seven. Rivalry with other European empires. At the end of the 16th century, England and the Dutch Empire began to challenge the Portuguese Empire's monopoly of trade with Asia, forming private joint-stock companies to finance the voyages: the English, later British, East India Company and the Dutch East India Company, chartered in 1600 and 1602 respectively. 
The primary aim of these companies was to tap into the lucrative spice trade, an effort focused mainly on two regions: the East Indies archipelago and an important hub in the trade network, India. There, they competed for trade supremacy with Portugal and with each other. Although England eventually eclipsed the Netherlands as a colonial power, in the short term the Netherlands' more advanced financial system and the three Anglo-Dutch Wars of the 17th century left the Dutch with a stronger position in Asia. Hostilities ceased after the Glorious Revolution of 1688, when the Dutch William of Orange ascended the English throne, bringing peace between the Dutch Republic and England. A deal between the two nations left the spice trade of the East Indies archipelago to the Netherlands and the textiles industry of India to England, but textiles soon overtook spices in terms of profitability. Peace between England and the Netherlands in 1688 meant that the two countries entered the Nine Years' War as allies, but the conflict, waged in Europe and overseas between France, Spain and the Anglo-Dutch alliance, left the English a stronger colonial power than the Dutch, who were forced to devote a larger proportion of their military budget to the costly land war in Europe. The death of Charles II of Spain in 1700 and his bequeathal of Spain and its colonial empire to Philip V of Spain, a grandson of the King of France, raised the prospect of the unification of France, Spain and their respective colonies, an unacceptable state of affairs for England and the other powers of Europe. In 1701, England, Portugal and the Netherlands sided with the Holy Roman Empire against Spain and France in the War of the Spanish Succession, which lasted for thirteen years. Colonies and territories of Scotland (1629–1707). Colonisation attempts by the Kingdom of Scotland predate the establishment of the United Kingdom of Great Britain and the British Empire. During the 17th century, the Kingdom of Scotland made attempts to establish trading schemes in Ireland and Canada. Nova Scotia, today a province of Canada, was a short-lived scheme in the territory of the Mi'kmaq people. Additionally, large numbers of Scots settled in Ireland as planters, particularly in the region of Ulster. North America. The first documented Scottish settlement in the Americas was Nova Scotia in 1629. On 29 September 1621, the charter for the foundation of a colony was granted by James VI of Scotland to Sir William Alexander. Between 1622 and 1628, Sir William launched four attempts to send colonists to Nova Scotia; all failed for various reasons. A successful settlement of Nova Scotia was finally achieved in 1629. The colony's charter, in law, made Nova Scotia (defined as all land between Newfoundland and New England, i.e., the Maritimes) a part of mainland Scotland; this was later used to get around the English Navigation Acts. As a country, Scotland engaged in two ventures during the 1680s in an attempt to establish colonies within English colonies, in Carolina and East New Jersey. The attempt to establish a settlement in Carolina was driven largely by Scottish Presbyterians fleeing the threat of religious persecution in Scotland; they were assisted by Scottish merchants aiming to develop transatlantic trade that was not subject to the English Navigation Acts, which restricted Scottish trade with English colonies. 
Scottish settlement attempts in Carolina began in 1682 and ended in 1686, when the Scottish settlement of Stewartstoun, established in 1684, was destroyed by the Spanish. South America. In 1695, the Parliament of Scotland granted a charter to the Company of Scotland, which established a settlement in 1698 on the Isthmus of Panama. Besieged by neighbouring Spanish colonists of New Granada, and affected by malaria, the colony was abandoned two years later. The Darien scheme was a financial disaster for Scotland: a quarter of Scottish capital was lost in the enterprise. The episode had major political consequences, helping to persuade the government of the Kingdom of Scotland of the merits of turning the personal union with England into a political and economic one under the Kingdom of Great Britain established by the Acts of Union 1707. British Empire (1707–1783). The 18th century saw the newly united Great Britain rise to be the world's dominant colonial power, with France becoming its main rival on the imperial stage. Great Britain, Portugal, the Netherlands, and the Holy Roman Empire continued the War of the Spanish Succession, which lasted until 1714 and was concluded by the Treaty of Utrecht. Philip V of Spain renounced his and his descendants' claim to the French throne, and Spain lost its empire in Europe. The British Empire was territorially enlarged: from France, Britain gained Newfoundland and Acadia, and from Spain, Gibraltar and Menorca. Gibraltar became a critical naval base and allowed Britain to control the Atlantic entry and exit point to the Mediterranean. Spain also ceded the rights to the lucrative "asiento" (permission to sell African slaves in Spanish America) to Britain. With the outbreak of the Anglo-Spanish War of Jenkins' Ear in 1739, Spanish privateers attacked British merchant shipping along the Triangle Trade routes. In 1746, the Spanish and British began peace talks, with the King of Spain agreeing to stop all attacks on British shipping; however, in the 1750 Treaty of Madrid Britain lost its slave-trading rights in Latin America. In the East Indies, British and Dutch merchants continued to compete in spices and textiles. With textiles becoming the larger trade, by 1720 the British company had overtaken the Dutch in terms of sales. During the middle decades of the 18th century, there were several outbreaks of military conflict on the Indian subcontinent as the English East India Company and its French counterpart struggled alongside local rulers to fill the vacuum that had been left by the decline of the Mughal Empire. The Battle of Plassey in 1757, in which the British defeated the Nawab of Bengal and his French allies, left the British East India Company in control of Bengal and as a major military and political power in India. France was left in control of its enclaves but with military restrictions and an obligation to support British client states, ending French hopes of controlling India. In the following decades, the British East India Company gradually increased the size of the territories under its control, either ruling directly or via local rulers under the threat of force from the Presidency Armies, the vast majority of which were composed of Indian sepoys led by British officers. The British and French struggles in India became but one theatre of the global Seven Years' War (1756–1763) involving France, Britain, and the other major European powers. 
The signing of the Treaty of Paris of 1763 had important consequences for the future of the British Empire. In North America, France's future as a colonial power effectively ended with the recognition of British claims to Rupert's Land, and the ceding of New France to Britain (leaving a sizeable French-speaking population under British control) and Louisiana to Spain. Spain ceded Florida to Britain. Along with its victory over France in India, the Seven Years' War therefore left Britain as the world's most powerful maritime power. Loss of the Thirteen American Colonies. During the 1760s and early 1770s, relations between the Thirteen Colonies and Britain became increasingly strained, primarily because of resentment of the British Parliament's attempts to govern and tax American colonists without their consent. This was summarised at the time by the colonists' slogan "No taxation without representation", a perceived violation of the guaranteed Rights of Englishmen. The American Revolution began with a rejection of Parliamentary authority and moves towards self-government. In response, Britain sent troops to reimpose direct rule, leading to the outbreak of war in 1775. The following year, the Second Continental Congress issued the Declaration of Independence, proclaiming the colonies' sovereignty from the British Empire as the new United States of America. The entry of French and Spanish forces into the war tipped the military balance in the Americans' favour, and after a decisive defeat at Yorktown in 1781, Britain began negotiating peace terms. American independence was acknowledged at the Peace of Paris in 1783. The loss of such a large portion of British America, at the time Britain's most populous overseas possession, is seen by some historians as the event defining the transition between the first and second empires, in which Britain shifted its attention away from the Americas to Asia, the Pacific and later Africa. Adam Smith's "Wealth of Nations", published in 1776, had argued that colonies were redundant and that free trade should replace the old mercantilist policies that had characterised the first period of colonial expansion, dating back to the protectionism of Spain and Portugal. The growth of trade between the newly independent United States and Britain after 1783 seemed to confirm Smith's view that political control was not necessary for economic success. The war to the south influenced British policy in Canada, where between 40,000 and 100,000 defeated Loyalists had migrated from the new United States following independence. The 14,000 Loyalists who went to the Saint John and Saint Croix river valleys, then part of Nova Scotia, felt too far removed from the provincial government in Halifax, so London split off New Brunswick as a separate colony in 1784. The Constitutional Act 1791 created the provinces of Upper Canada (mainly English-speaking) and Lower Canada (mainly French-speaking) to defuse tensions between the French and British communities, and implemented governmental systems similar to those employed in Britain, with the intention of asserting imperial authority and not allowing the sort of popular control of government that was perceived to have led to the American Revolution. Tensions between Britain and the United States escalated again during the Napoleonic Wars, as Britain tried to cut off American trade with France and boarded American ships to impress men into the Royal Navy. 
The United States Congress declared war, beginning the War of 1812, and invaded Canadian territory. In response, Britain invaded the US, but the pre-war boundaries were reaffirmed by the 1814 Treaty of Ghent, ensuring Canada's future would be separate from that of the United States. British Empire (1783–1815). Exploration of the Pacific. Since 1718, transportation to the American colonies had been a penalty for various offences in Britain, with approximately one thousand convicts transported per year. Forced to find an alternative location after the loss of the Thirteen Colonies in 1783, the British government eventually turned to Australia. On his first of three voyages commissioned by the government, James Cook reached New Zealand in October 1769. He was the first European to circumnavigate and map the country. From the late 18th century, New Zealand was regularly visited by explorers and other sailors, missionaries, traders and adventurers, but no attempt was made to settle the country or establish possession. The coast of Australia had been discovered for Europeans by the Dutch in 1606, but there was no attempt to colonise it. In 1770, after leaving New Zealand, James Cook charted the eastern coast, claimed the continent for Britain, and named it New South Wales. In 1778, Joseph Banks, Cook's botanist on the voyage, presented evidence to the government on the suitability of Botany Bay for the establishment of a penal settlement, and in 1787 the first shipment of convicts set sail, arriving in 1788. Unusually, Australia was claimed through proclamation. Indigenous Australians were considered too uncivilised to require treaties, and colonisation brought disease and violence that, together with the deliberate dispossession of land and culture, were devastating to these peoples. Britain continued to transport convicts to New South Wales until 1840, to Tasmania until 1853 and to Western Australia until 1868. The Australian colonies became profitable exporters of wool and gold, mainly because of the Victorian gold rush, which for a time made the Victorian capital, Melbourne, the richest city in the world. The British also expanded their mercantile interests in the North Pacific. Spain and Britain had become rivals in the area, culminating in the Nootka Crisis in 1789. Both sides mobilised for war, but when France refused to support Spain, the Spanish were forced to back down, leading to the Nootka Convention. The outcome was a humiliation for Spain, which practically renounced all sovereignty on the North Pacific coast. This opened the way to British expansion in the area, and a number of expeditions took place: first, a naval expedition led by George Vancouver, which explored the inlets of the Pacific Northwest, particularly around Vancouver Island. On land, expeditions sought to discover a river route to the Pacific for the extension of the North American fur trade. Alexander Mackenzie of the North West Company led the first, starting out in 1792, and a year later he became the first European to reach the Pacific overland north of the Rio Grande, reaching the ocean near present-day Bella Coola. This preceded the Lewis and Clark Expedition by twelve years. Shortly thereafter, Mackenzie's companion, John Finlay, founded the first permanent European settlement in British Columbia, Fort St. John. The North West Company sought further exploration and backed expeditions by David Thompson, starting in 1797, and later by Simon Fraser. 
These pushed into the wilderness territories of the Rocky Mountains and the Interior Plateau to the Strait of Georgia on the Pacific coast, expanding British North America westward. Continued conquest in India. The East India Company fought a series of Anglo-Mysore Wars in Southern India with the Sultanate of Mysore under Hyder Ali and then Tipu Sultan. Defeats in the First Anglo-Mysore War and stalemate in the Second were followed by victories in the Third and the Fourth. Following Tipu Sultan's death in the Fourth War at the Siege of Seringapatam (1799), the kingdom became a protectorate of the company. The East India Company fought three Anglo-Maratha Wars with the Maratha Confederacy. The First Anglo-Maratha War ended in 1782 with a restoration of the pre-war "status quo". The Second and Third Anglo-Maratha Wars resulted in British victories. After the surrender of Peshwa Bajirao II in 1818, the East India Company acquired control of a large majority of the Indian subcontinent. Wars with France. Britain was challenged again by France under Napoleon, in a struggle that, unlike previous wars, represented a contest of ideologies between the two nations. It was not only Britain's position on the world stage that was at risk: Napoleon threatened to invade Britain itself, just as his armies had overrun many countries of continental Europe. The Napoleonic Wars were therefore ones in which Britain invested large amounts of capital and resources to win. French ports were blockaded by the Royal Navy, which won a decisive victory over a combined French and Spanish fleet at the Battle of Trafalgar in 1805. Overseas colonies were attacked and occupied, including those of the Netherlands, which was annexed by Napoleon in 1810. France was finally defeated by a coalition of European armies in 1815. Britain was again the beneficiary of peace treaties: France ceded the Ionian Islands, Malta (which it had occupied in 1798), Mauritius, St Lucia, the Seychelles, and Tobago; Spain ceded Trinidad; the Netherlands ceded Guiana, Ceylon and the Cape Colony; and Denmark ceded Heligoland. Britain returned Guadeloupe, Martinique, French Guiana, and Réunion to France; Menorca to Spain; the Danish West Indies to Denmark; and Java and Suriname to the Netherlands. Abolition of slavery. With the advent of the Industrial Revolution, goods produced by slavery became less important to the British economy. Added to this was the cost of suppressing regular slave rebellions. With support from the British abolitionist movement, Parliament enacted the Slave Trade Act in 1807, which abolished the slave trade in the empire. In 1808, Sierra Leone Colony was designated an official British colony for freed slaves. Parliamentary reform in 1832 saw the influence of the West India Committee decline. The Slavery Abolition Act, passed the following year, abolished slavery in the British Empire on 1 August 1834, finally bringing the empire into line with the law in the UK (with the exception of the territories administered by the East India Company and Ceylon, where slavery was ended in 1844). Under the Act, slaves were granted full emancipation after a period of four to six years of "apprenticeship". Facing further opposition from abolitionists, the apprenticeship system was abolished in 1838. The British government compensated slave-owners. Britain's imperial century (1815–1914). 
Between 1815 and 1914, a period referred to as Britain's "imperial century" by some historians, around 26,000,000 km² (10,000,000 sq mi) of territory and roughly 400 million people were added to the British Empire. Victory over Napoleon left Britain without any serious international rival, other than Russia in Central Asia. Unchallenged at sea, Britain adopted the role of global policeman, a state of affairs later known as the "Pax Britannica", and a foreign policy of "splendid isolation". Alongside the formal control it exerted over its own colonies, Britain's dominant position in world trade meant that it effectively controlled the economies of many countries, such as China, Argentina and Siam, an arrangement that has been described by some historians as an "Informal Empire". British imperial strength was underpinned by the steamship and the telegraph, new technologies invented in the second half of the 19th century, allowing it to control and defend the empire. By 1902, the British Empire was linked together by a network of telegraph cables, called the All Red Line. East India Company rule and the British Raj in India. The East India Company drove the expansion of the British Empire in Asia. The company's army had first joined forces with the Royal Navy during the Seven Years' War, and the two continued to co-operate in arenas outside India: the eviction of the French from Egypt (1799), the capture of Java from the Netherlands (1811), the acquisition of Penang Island (1786), Singapore (1819) and Malacca (1824), and the defeat of Burma (1826). From its base in India, the company had been engaged in an increasingly profitable opium export trade to Qing China since the 1730s. This trade, illegal since it was outlawed by China in 1729, helped reverse the trade imbalances resulting from the British imports of tea, which saw large outflows of silver from Britain to China. In 1839, the confiscation by the Chinese authorities at Canton of 20,000 chests of opium led Britain to attack China in the First Opium War, and resulted in the seizure by Britain of Hong Kong Island, at that time a minor settlement, and other treaty ports including Shanghai. During the late 18th and early 19th centuries, the British Crown began to assume an increasingly large role in the affairs of the company. A series of acts of Parliament were passed, including the Regulating Act 1773, the East India Company Act 1784 and the Charter Act 1813, which regulated the company's affairs and established the sovereignty of the Crown over the territories that it had acquired. The company's eventual end was precipitated by the Indian Rebellion of 1857, a conflict that had begun with the mutiny of sepoys, Indian troops under British officers and discipline. The rebellion took six months to suppress, with heavy loss of life on both sides. The following year, the British government dissolved the company and assumed direct control over India through the Government of India Act 1858, establishing the British Raj, under which an appointed governor-general administered India; Queen Victoria was later crowned Empress of India. India became the empire's most valuable possession, "the Jewel in the Crown", and was the most important source of Britain's strength. A series of serious crop failures in the late 19th century led to widespread famines on the subcontinent, in which it is estimated that over 15 million people died. The East India Company had failed to implement any coordinated policy to deal with the famines during its period of rule. 
Later, under direct British rule, commissions were set up after each famine to investigate the causes and implement new policies, which took until the early 1900s to have an effect. New Zealand. On each of his three voyages to the Pacific between 1769 and 1777, James Cook visited New Zealand. He was followed by an assortment of Europeans and Americans, including whalers, sealers, escaped convicts from New South Wales, missionaries and adventurers. Initially, contact with the indigenous Māori people was limited to the trading of goods, although interaction increased during the early decades of the 19th century, with many trading and missionary stations being set up, especially in the north. The first of several Church of England missionaries arrived in 1814, and beyond their missionary role they soon became the only form of European authority in a land that was not subject to British jurisdiction, the closest authority being the New South Wales governor in Sydney. From 1818, the sale of weapons to Māori fuelled the intertribal warfare of the Musket Wars, with devastating consequences for the Māori population. The UK government finally decided to act, dispatching Captain William Hobson with instructions to take formal possession after obtaining native consent. There was no central Māori authority able to represent all New Zealand, so on 6 February 1840 Hobson and many Māori chiefs signed the Treaty of Waitangi in the Bay of Islands, with most other chiefs signing in stages over the following months. William Hobson declared British sovereignty over all New Zealand on 21 May 1840, over the North Island by cession and over the South Island by discovery (the island was sparsely populated and deemed terra nullius). Hobson became Lieutenant-Governor, subject to Governor Sir George Gipps in Sydney, with British possession of New Zealand initially administered from Australia as a dependency of the New South Wales colony. From 16 June 1840, New South Wales laws applied in New Zealand. This transitional arrangement ended with the Charter for Erecting the Colony of New Zealand on 16 November 1840. The Charter stated that New Zealand would be established as a separate Crown colony on 3 May 1841, with Hobson as its governor. Rivalry with Russia. During the 19th century, Britain and the Russian Empire vied to fill the power vacuums that had been left by the declining Ottoman Empire, Qajar dynasty and Qing dynasty. This rivalry in Central Asia came to be known as the "Great Game". As far as Britain was concerned, defeats inflicted by Russia on Persia and Turkey demonstrated Russia's imperial ambitions and capabilities and stoked fears in Britain of an overland invasion of India. In 1839, Britain moved to pre-empt this by invading Afghanistan, but the First Anglo-Afghan War was a disaster for Britain. When Russia invaded the Ottoman Balkans in 1853, fears of Russian dominance in the Mediterranean and the Middle East led Britain and France to enter the war in support of the Ottoman Empire and invade the Crimean Peninsula to destroy Russian naval capabilities. The ensuing Crimean War (1854–1856), which involved new techniques of modern warfare, was the only global war fought between Britain and another imperial power during the "Pax Britannica" and was a resounding defeat for Russia. The situation in Central Asia remained unresolved for two more decades, with Britain annexing Baluchistan in 1876 and Russia annexing Kirghizia, Kazakhstan, and Turkmenistan. 
For a while, it appeared that another war would be inevitable, but the two countries reached an agreement on their respective spheres of influence in the region in 1878, and on all outstanding matters in 1907 with the signing of the Anglo-Russian Entente. The destruction of the Imperial Russian Navy by the Imperial Japanese Navy at the Battle of Tsushima during the Russo-Japanese War of 1904–1905 further limited the Russian threat to the British. Cape to Cairo. The Dutch East India Company had founded the Dutch Cape Colony on the southern tip of Africa in 1652 as a way station for its ships travelling to and from its colonies in the East Indies. Britain formally acquired the colony and its large Afrikaner (or Boer) population in 1806, having occupied it in 1795 to prevent its falling into French hands during the Flanders Campaign. British immigration to the Cape Colony began to rise after 1820, and pushed thousands of Boers, resentful of British rule, northwards to found their own, mostly short-lived, independent republics during the Great Trek of the late 1830s and early 1840s. In the process the Voortrekkers clashed repeatedly with the British, who had their own agenda with regard to colonial expansion in South Africa and to the various native African polities, including those of the Sotho people and the Zulu Kingdom. Eventually, the Boers established two republics that had a longer lifespan: the South African Republic or Transvaal Republic (1852–1877; 1881–1902) and the Orange Free State (1854–1902). In 1902, Britain occupied both republics, concluding a treaty with them following the Second Boer War (1899–1902). In 1869, the Suez Canal, built by the French under Napoleon III, was opened, linking the Mediterranean Sea with the Indian Ocean. Initially the canal was opposed by the British; but once opened, its strategic value was quickly recognised, and the canal became the "jugular vein of the Empire". In 1875, the Conservative government of Benjamin Disraeli bought the indebted Egyptian ruler Isma'il Pasha's 44 per cent shareholding in the Suez Canal for £4 million. Although this did not grant outright control of the strategic waterway, it did give Britain leverage. Joint Anglo-French financial control over Egypt ended in outright British occupation in 1882. Although Britain controlled the Khedivate of Egypt into the 20th century, it was officially a vassal state of the Ottoman Empire and not part of the British Empire. The French were still majority shareholders and attempted to weaken the British position, but a compromise was reached with the 1888 Convention of Constantinople, which made the Canal officially neutral territory. With competitive French, Belgian and Portuguese activity in the lower Congo River region undermining orderly colonisation of tropical Africa, the Berlin Conference of 1884–85 was held to regulate the competition between the European powers in what was called the "Scramble for Africa" by defining "effective occupation" as the criterion for international recognition of territorial claims. The scramble continued into the 1890s, and caused Britain to reconsider its decision in 1885 to withdraw from Sudan. A joint force of British and Egyptian troops defeated the Mahdist Army in 1896 and rebuffed an attempted French invasion at Fashoda in 1898. Sudan was nominally made an Anglo-Egyptian condominium, but a British colony in reality. 
British gains in Southern and East Africa prompted Cecil Rhodes, pioneer of British expansion in Southern Africa, to urge a "Cape to Cairo" railway linking the strategically important Suez Canal to the mineral-rich south of the continent. During the 1880s and 1890s, Rhodes, with his privately owned British South Africa Company, occupied and annexed territories subsequently named after him: Rhodesia. Changing status of the white colonies. The path to independence for the white colonies of the British Empire began with the 1839 Durham Report, which proposed unification and self-government for Upper and Lower Canada as a solution to the political unrest that had erupted in armed rebellions in 1837. Implementation began with the passing of the Act of Union in 1840, which created the Province of Canada. Responsible government was first granted to Nova Scotia in 1848, and was soon extended to the other British North American colonies. With the passage of the British North America Act, 1867 by the British Parliament, the Province of Canada, New Brunswick and Nova Scotia were formed into Canada, a confederation enjoying full self-government with the exception of international relations. Australia and New Zealand achieved similar levels of self-government after 1900, with the Australian colonies federating in 1901. The term "dominion status" was officially introduced at the 1907 Imperial Conference. As the dominions gained greater autonomy, they came to be recognised as distinct realms of the empire with unique customs and symbols of their own. Imperial identity, through imagery such as patriotic artworks and banners, began developing into a form that attempted to be more inclusive by showcasing the empire as a family of newly birthed nations with common roots. The last decades of the 19th century saw concerted political campaigns for Irish home rule. Ireland had been united with Britain into the United Kingdom of Great Britain and Ireland with the Act of Union 1800 after the Irish Rebellion of 1798, and had suffered a severe famine between 1845 and 1852. Home rule was supported by the British prime minister, William Gladstone, who hoped that Ireland might follow in Canada's footsteps as a Dominion within the empire, but his 1886 Home Rule bill was defeated in Parliament. Although the bill, if passed, would have granted Ireland less autonomy within the UK than the Canadian provinces had within their own federation, many MPs feared that a partially independent Ireland might pose a security threat to Great Britain or mark the beginning of the break-up of the empire. A second Home Rule bill was defeated for similar reasons. A third bill was passed by Parliament in 1914 but not implemented because of the outbreak of the First World War; the resulting delay contributed to the 1916 Easter Rising. World wars (1914–1945). By the turn of the 20th century, fears had begun to grow in Britain that it would no longer be able to defend the metropole and the entirety of the empire while at the same time maintaining the policy of "splendid isolation". Germany was rapidly rising as a military and industrial power and was now seen as the most likely opponent in any future war. Recognising that it was overstretched in the Pacific and threatened at home by the Imperial German Navy, Britain formed an alliance with Japan in 1902 and with its old enemies France and Russia in 1904 and 1907, respectively. First World War. Britain's fears of war with Germany were realised in 1914 with the outbreak of the First World War. 
Britain quickly invaded and occupied most of Germany's overseas colonies in Africa. In the Pacific, Australia and New Zealand occupied German New Guinea and German Samoa respectively. Plans for a post-war division of the Ottoman Empire, which had joined the war on Germany's side, were secretly drawn up by Britain and France under the 1916 Sykes–Picot Agreement. This agreement was not divulged to the Sharif of Mecca, whom the British had been encouraging to launch an Arab revolt against their Ottoman rulers, giving the impression that Britain was supporting the creation of an independent Arab state. The British declaration of war on Germany and its allies committed the colonies and Dominions, which provided invaluable military, financial and material support. Over 2.5 million men served in the armies of the Dominions, as well as many thousands of volunteers from the Crown colonies. The contributions of Australian and New Zealand troops during the 1915 Gallipoli Campaign against the Ottoman Empire had a great impact on the national consciousness at home and marked a watershed in the transition of Australia and New Zealand from colonies to nations in their own right. The countries continue to commemorate this occasion on Anzac Day. Canadians viewed the Battle of Vimy Ridge in a similar light. The important contribution of the Dominions to the war effort was recognised in 1917 by British prime minister David Lloyd George when he invited each of the Dominion prime ministers to join an Imperial War Cabinet to co-ordinate imperial policy. Under the terms of the concluding Treaty of Versailles signed in 1919, the empire reached its greatest extent with the addition of 4,700,000 km² (1,800,000 sq mi) of territory and 13 million new subjects. The colonies of Germany and the Ottoman Empire were distributed to the Allied powers as League of Nations mandates. Britain gained control of Palestine, Transjordan, Iraq, parts of Cameroon and Togoland, and Tanganyika. The Dominions themselves acquired mandates of their own: the Union of South Africa gained South West Africa (modern-day Namibia), Australia gained New Guinea, and New Zealand gained Western Samoa. Nauru was made a combined mandate of Britain and the two Pacific Dominions. Inter-war period. The changing world order that the war had brought about, in particular the growth of the United States and Japan as naval powers, and the rise of independence movements in India and Ireland, caused a major reassessment of British imperial policy. Forced to choose between alignment with the United States or Japan, Britain opted not to renew its Anglo-Japanese Alliance and instead signed the 1922 Washington Naval Treaty, under which Britain accepted naval parity with the United States. This decision was the source of much debate in Britain during the 1930s as militaristic governments, helped in part by the Great Depression, took hold in Germany and Japan, for it was feared that the empire could not survive a simultaneous attack by both nations. The issue of the empire's security was a serious concern in Britain, as the empire was vital to the British economy. In 1919, the frustrations caused by delays to Irish home rule led the MPs of Sinn Féin, a pro-independence party that had won a majority of the Irish seats in the 1918 British general election, to establish an independent parliament in Dublin, at which Irish independence was declared. The Irish Republican Army simultaneously began a guerrilla war against the British administration. 
The Irish War of Independence ended in 1921 with a stalemate and the signing of the Anglo-Irish Treaty, creating the Irish Free State, a Dominion within the British Empire, with effective internal independence but still constitutionally linked with the British Crown. Northern Ireland, consisting of six of the 32 Irish counties which had been established as a devolved region under the 1920 Government of Ireland Act, immediately exercised its option under the treaty to retain its existing status within the United Kingdom. A similar struggle began in India when the Government of India Act 1919 failed to satisfy the demand for independence. Concerns over communist and foreign plots following the Ghadar conspiracy ensured that war-time strictures were renewed by the Rowlatt Acts. This led to tension, particularly in the Punjab region, where repressive measures culminated in the Amritsar Massacre. In Britain, public opinion was divided over the morality of the massacre, between those who saw it as having saved India from anarchy, and those who viewed it with revulsion. Gandhi's ensuing non-cooperation movement was called off in March 1922 following the Chauri Chaura incident, and discontent continued to simmer for the next 25 years. In 1922, Egypt, which had been declared a British protectorate at the outbreak of the First World War, was granted formal independence, though it continued to be a British client state until 1954. British troops remained stationed in Egypt until the signing of the Anglo-Egyptian Treaty in 1936, under which it was agreed that the troops would withdraw but continue to occupy and defend the Suez Canal zone. In return, Egypt was assisted in joining the League of Nations. Iraq, a British mandate since 1920, gained membership of the League in its own right after achieving independence from Britain in 1932. In Palestine, Britain was presented with the problem of mediating between the Arabs and increasing numbers of Jews. The Balfour Declaration, which had been incorporated into the terms of the mandate, stated that a national home for the Jewish people would be established in Palestine, and Jewish immigration allowed up to a limit that would be determined by the mandatory power. This led to increasing conflict with the Arab population, who openly revolted in 1936. As the threat of war with Germany increased during the 1930s, Britain judged the support of Arabs as more important than the establishment of a Jewish homeland, and shifted to a pro-Arab stance, limiting Jewish immigration and in turn triggering a Jewish insurgency. The right of the Dominions to set their own foreign policy, independent of Britain, was recognised at the 1923 Imperial Conference. Britain's request for military assistance from the Dominions at the outbreak of the Chanak Crisis the previous year had been turned down by Canada and South Africa, and Canada had refused to be bound by the 1923 Treaty of Lausanne. After pressure from the Irish Free State and South Africa, the 1926 Imperial Conference issued the Balfour Declaration of 1926, declaring Britain and the Dominions to be "autonomous Communities within the British Empire, equal in status, in no way subordinate one to another" within a "British Commonwealth of Nations". This declaration was given legal substance under the 1931 Statute of Westminster. 
The parliaments of Canada, Australia, New Zealand, the Union of South Africa, the Irish Free State and Newfoundland were now independent of British legislative control: they could nullify British laws, and Britain could no longer pass laws for them without their consent. Newfoundland, suffering from financial difficulties during the Great Depression, reverted to colonial status in 1933. In 1937, the Irish Free State introduced a republican constitution, renaming itself "Ireland". Second World War. Britain's declaration of war against Nazi Germany in September 1939 included the Crown colonies and India but did not automatically commit the Dominions of Australia, Canada, New Zealand, Newfoundland and South Africa. All soon declared war on Germany. While Britain continued to regard Ireland as a member of the British Commonwealth, Ireland chose to remain legally neutral throughout the war. After the Fall of France in June 1940, Britain and the empire stood alone against Germany, until the German invasion of Greece on 6 April 1941. British Prime Minister Winston Churchill successfully lobbied President Franklin D. Roosevelt for military aid from the United States, but Roosevelt was not yet ready to ask Congress to commit the country to war. In August 1941, Churchill and Roosevelt met and signed the Atlantic Charter, which included the statement that "the rights of all peoples to choose the form of government under which they live" should be respected. This wording was ambiguous as to whether it referred to European countries invaded by Germany and Italy, or the peoples colonised by European nations, and would later be interpreted differently by the British, Americans, and nationalist movements. Nevertheless, Churchill rejected its universal applicability when it came to the self-determination of subject nations, including the British Indian Empire. Churchill added that he had not become Prime Minister to oversee the liquidation of the empire. For Churchill, the entry of the United States into the war was the "greatest joy". He felt that Britain was now assured of victory, but failed to recognise that the "many disasters, immeasurable costs and tribulations [which he knew] lay ahead" in December 1941 would have permanent consequences for the future of the empire. The manner in which British forces were rapidly defeated in the Far East irreversibly harmed Britain's standing and prestige as an imperial power, most notably the Fall of Singapore, which had previously been hailed as an impregnable fortress and the eastern equivalent of Gibraltar. The realisation that Britain could not defend its entire empire pushed Australia and New Zealand, which now appeared threatened by Japanese forces, into closer ties with the United States and, ultimately, the 1951 ANZUS Pact. The war weakened the empire in other ways: undermining Britain's control of politics in India, inflicting long-term economic damage, and irrevocably changing geopolitics by pushing the Soviet Union and the United States to the centre of the global stage. Decolonisation and decline (1945–1997). Though Britain and the empire emerged victorious from the Second World War, the effects of the conflict were profound, both at home and abroad. Much of Europe, a continent that had dominated the world for several centuries, was in ruins, and host to the armies of the United States and the Soviet Union, who now held the balance of global power. 
Britain was left essentially bankrupt, with insolvency only averted in 1946 after the negotiation of a US$3.75 billion loan from the United States, the last instalment of which was repaid in 2006. At the same time, anti-colonial movements were on the rise in the colonies of European nations. The situation was complicated further by the increasing Cold War rivalry of the United States and the Soviet Union. In principle, both nations were opposed to European colonialism. In practice, American anti-communism prevailed over anti-imperialism, and therefore the United States supported the continued existence of the British Empire to keep Communist expansion in check. At first, British politicians believed it would be possible to maintain Britain's role as a world power at the head of a re-imagined Commonwealth, but by 1960 they were forced to recognise that there was an irresistible "wind of change" blowing. Their priorities changed to maintaining an extensive zone of British influence and ensuring that stable, non-Communist governments were established in former colonies. In this context, while other European powers such as France and Portugal waged costly and unsuccessful wars to keep their empires intact, Britain generally adopted a policy of peaceful disengagement from its colonies, although violence occurred in Malaya, Kenya and Palestine. Between 1945 and 1965, the number of people under British rule outside the UK itself fell from 700 million to 5 million, 3 million of whom were in Hong Kong. Initial disengagement. The pro-decolonisation Labour government, elected at the 1945 general election and led by Clement Attlee, moved quickly to tackle the most pressing issue facing the empire: Indian independence. India's major political party—the Indian National Congress (led by Mahatma Gandhi)—had been campaigning for independence for decades, but disagreed with the Muslim League (led by Muhammad Ali Jinnah) as to how it should be implemented. Congress favoured a unified secular Indian state, whereas the League, fearing domination by the Hindu majority, desired a separate Islamic state for Muslim-majority regions. Increasing civil unrest led Attlee to promise independence no later than 30 June 1948. When the urgency of the situation and risk of civil war became apparent, the newly appointed (and last) Viceroy, Lord Mountbatten, hastily brought forward the date to 15 August 1947. The borders drawn by the British to broadly partition India into Hindu and Muslim areas left tens of millions as minorities in the newly independent states of India and Pakistan. The princely states were given the choice either to remain independent or to join India or Pakistan. Millions of Muslims crossed from India to Pakistan and Hindus vice versa, and violence between the two communities cost hundreds of thousands of lives. Burma, which had been administered as part of British India until 1937, gained independence the following year, in 1948, along with Sri Lanka (formerly known as British Ceylon). India, Pakistan and Sri Lanka became members of the Commonwealth, while Burma chose not to join. That same year, the British Nationality Act was enacted in hopes of strengthening and unifying the Commonwealth: it provided British citizenship and right of entry to all those living within its jurisdiction. The British Mandate in Palestine, where an Arab majority lived alongside a Jewish minority, presented the British with a similar problem to that of India. 
The matter was complicated by large numbers of Jewish refugees seeking to be admitted to Palestine following the Holocaust, while Arabs were opposed to the creation of a Jewish state. Frustrated by the intractability of the problem, attacks by Jewish paramilitary organisations and the increasing cost of maintaining its military presence, Britain announced in 1947 that it would withdraw in 1948 and leave the matter to the United Nations to solve. The UN General Assembly subsequently voted for a plan to partition Palestine into a Jewish and an Arab state. It was immediately followed by the outbreak of a civil war between the Arabs and Jews of Palestine, and British forces withdrew amid the fighting. The British Mandate for Palestine officially terminated at midnight on 15 May 1948 as the State of Israel declared independence and the 1948 Arab-Israeli War broke out, during which the territory of the former Mandate was partitioned between Israel and the surrounding Arab states. Amid the fighting, British forces continued to withdraw from Israel, with the last British troops departing from Haifa on 30 June 1948. Following the surrender of Japan in the Second World War, anti-Japanese resistance movements in Malaya turned their attention towards the British, who had moved quickly to retake control of the colony, valuing it as a source of rubber and tin. The fact that the guerrillas were primarily Malayan Chinese Communists meant that the British attempt to quell the uprising was supported by the Muslim Malay majority, on the understanding that once the insurgency had been quelled, independence would be granted. The Malayan Emergency, as it was called, began in 1948 and lasted until 1960, but by 1957, Britain felt confident enough to grant independence to the Federation of Malaya within the Commonwealth. In 1963, the 11 states of the federation together with Singapore, Sarawak and North Borneo joined to form Malaysia, but in 1965 Chinese-majority Singapore was expelled from the union following tensions between the Malay and Chinese populations and became an independent city-state. Brunei, which had been a British protectorate since 1888, declined to join the union. Suez and its aftermath. In the 1951 general election, the Conservative Party returned to power in Britain under the leadership of Winston Churchill. Churchill and the Conservatives believed that Britain's position as a world power relied on the continued existence of the empire, with the base at the Suez Canal allowing Britain to maintain its pre-eminent position in the Middle East in spite of the loss of India. Churchill could not, however, ignore Gamal Abdel Nasser's new revolutionary government of Egypt, which had taken power in 1952; the following year it was agreed that British troops would withdraw from the Suez Canal zone and that Sudan would be granted self-determination by 1955, with independence to follow. Sudan was duly granted independence on 1 January 1956. In July 1956, Nasser unilaterally nationalised the Suez Canal. The response of Anthony Eden, who had succeeded Churchill as Prime Minister, was to collude with France to engineer an Israeli attack on Egypt that would give Britain and France an excuse to intervene militarily and retake the canal. Eden infuriated US President Dwight D. Eisenhower by his lack of consultation, and Eisenhower refused to back the invasion. Another of Eisenhower's concerns was the possibility of a wider war with the Soviet Union after it threatened to intervene on the Egyptian side. 
Eisenhower applied financial leverage by threatening to sell US reserves of the British pound and thereby precipitate a collapse of the British currency. Though the invasion force was militarily successful in its objectives, UN intervention and US pressure forced Britain into a humiliating withdrawal of its forces, and Eden resigned. The Suez Crisis very publicly exposed Britain's limitations to the world and confirmed Britain's decline on the world stage and its end as a first-rate power, demonstrating that henceforth it could no longer act without at least the acquiescence, if not the full support, of the United States. The events at Suez wounded British national pride, leading one Member of Parliament (MP) to describe it as "Britain's Waterloo" and another to suggest that the country had become an "American satellite". Margaret Thatcher later described the mindset she believed had befallen Britain's political leaders after Suez, in which they "went from believing that Britain could do anything to an almost neurotic belief that Britain could do nothing", a mindset from which Britain did not recover until the successful recapture of the Falkland Islands from Argentina in 1982. While the Suez Crisis caused British power in the Middle East to weaken, it did not collapse. Britain again deployed its armed forces to the region, intervening in Oman (1957), Jordan (1958) and Kuwait (1961), though on these occasions with American approval, as the new Prime Minister Harold Macmillan's foreign policy was to remain firmly aligned with the United States. Although Britain granted Kuwait independence in 1961, it continued to maintain a military presence in the Middle East for another decade. On 16 January 1968, a few weeks after the devaluation of the pound, Prime Minister Harold Wilson and his Defence Secretary Denis Healey announced that British Armed Forces troops would be withdrawn from major military bases East of Suez, including those in the Middle East, and primarily from Malaysia and Singapore, by the end of 1971, instead of 1975 as earlier planned. By that time over 50,000 British military personnel were still stationed in the Far East, including 30,000 in Singapore. The British granted independence to the Maldives in 1965 but continued to station a garrison there until 1976, withdrew from Aden in 1967, and granted independence to Bahrain, Qatar, and the United Arab Emirates in 1971. Wind of change. Macmillan gave a speech in Cape Town, South Africa, in February 1960 in which he spoke of "the wind of change blowing through this continent". Macmillan wished to avoid the same kind of colonial war that France was fighting in Algeria, and under his premiership decolonisation proceeded rapidly. To the three colonies that had been granted independence in the 1950s—Sudan, the Gold Coast and Malaya—were added nearly ten times that number during the 1960s. Owing to the rapid pace of decolonisation during this period, the cabinet post of Secretary of State for the Colonies was abolished in 1966, along with the Colonial Office, which merged with the Commonwealth Relations Office to form the Foreign and Commonwealth Office (now the Foreign, Commonwealth and Development Office) in October 1968. Britain's remaining colonies in Africa, except for self-governing Southern Rhodesia, were all granted independence by 1968. British withdrawal from the southern and eastern parts of Africa was not a peaceful process. 
From 1952 the Kenya Colony saw the eight-year-long Mau Mau rebellion, during which the colonial government interned tens of thousands of suspected rebels in detention camps and executed over 1,000 convicts; records of the repression were later systematically destroyed. Throughout the 1960s, the British government adopted a policy of "no independence before majority rule" towards decolonising the empire, leading the white minority government of Southern Rhodesia to issue the 1965 Unilateral Declaration of Independence from Britain, resulting in a civil war that lasted until the British-mediated Lancaster House Agreement of 1979. Under the agreement, Britain temporarily re-established the Colony of Southern Rhodesia from 1979 to 1980 as a transitional government ahead of majority rule in the new Republic of Zimbabwe; Southern Rhodesia was thus the last British possession in Africa. In Cyprus, a guerrilla war waged by the Greek Cypriot organisation EOKA against British rule was ended in 1959 by the London and Zürich Agreements, which resulted in Cyprus being granted independence in 1960. The UK retained the military bases of Akrotiri and Dhekelia as sovereign base areas. The Mediterranean colony of Malta was amicably granted independence from the UK in 1964 and became the State of Malta, though the idea of integration with Britain had been raised in 1955. Most of the UK's Caribbean territories achieved independence after the departure in 1961 and 1962 of Jamaica and Trinidad from the West Indies Federation, established in 1958 in an attempt to unite the British Caribbean colonies under one government, but which collapsed following the loss of its two largest members. Jamaica attained independence in 1962, as did Trinidad and Tobago. Barbados achieved independence in 1966, and the remainder of the eastern Caribbean islands, including the Bahamas, followed in the 1970s and 1980s, but Anguilla and the Turks and Caicos Islands opted to revert to British rule after they had already started on the path to independence. The British Virgin Islands, the Cayman Islands and Montserrat opted to retain ties with Britain, while Guyana achieved independence in 1966. Britain's last colony on the American mainland, British Honduras, became a self-governing colony in 1964 and was renamed Belize in 1973, achieving full independence in 1981. A dispute with Guatemala over claims to Belize was left unresolved. British territories in the Pacific acquired independence in the 1970s, beginning with Fiji in 1970 and ending with Vanuatu in 1980. Vanuatu's independence was delayed because of political conflict between English- and French-speaking communities, as the islands had been jointly administered as a condominium with France. Fiji, Papua New Guinea, Solomon Islands and Tuvalu became Commonwealth realms. End of empire. By 1981, aside from a scattering of islands and outposts, the process of decolonisation that had begun after the Second World War was largely complete. In 1982, Britain's resolve in defending its remaining overseas territories was tested when Argentina invaded the Falkland Islands, acting on a long-standing claim that dated back to the Spanish Empire. Britain's successful military response to retake the Falkland Islands during the ensuing Falklands War contributed to reversing the downward trend in Britain's status as a world power. The 1980s saw Canada, Australia, and New Zealand sever their final constitutional links with Britain. 
Although these countries had been granted legislative independence by the Statute of Westminster 1931, vestigial constitutional links remained in place. The British Parliament retained the power to amend key Canadian constitutional statutes, meaning that an act of the British Parliament was required to make certain changes to the Canadian Constitution. The British Parliament also had the power to pass laws extending to Canada at Canadian request. Although no longer able to legislate for the Commonwealth of Australia, the British Parliament retained the power to legislate for the individual Australian states. With regard to New Zealand, the British Parliament retained the power to pass legislation applying to New Zealand with the New Zealand Parliament's consent. In 1982, the last legal link between Canada and Britain was severed by the Canada Act 1982, which was passed by the British Parliament, formally patriating the Canadian Constitution. The act ended the need for British involvement in changes to the Canadian constitution. Similarly, the Australia Act 1986 (effective 3 March 1986) severed the constitutional link between Britain and the Australian states, while New Zealand's Constitution Act 1986 (effective 1 January 1987) reformed the constitution of New Zealand to sever its constitutional link with Britain. On 1 January 1984, Brunei, Britain's last remaining Asian protectorate, was granted full independence. Independence had been delayed due to the opposition of the Sultan, who had preferred British protection. In September 1982, the Prime Minister, Margaret Thatcher, travelled to Beijing to negotiate with the Chinese Communist government on the future of Britain's last major and most populous overseas territory, Hong Kong. Under the terms of the 1842 Treaty of Nanking and 1860 Convention of Peking, Hong Kong Island and Kowloon Peninsula had been respectively ceded to Britain "in perpetuity", but the majority of the colony consisted of the New Territories, which had been acquired under a 99-year lease in 1898, due to expire in 1997. Thatcher, seeing parallels with the Falkland Islands, initially wished to hold Hong Kong and proposed continued British administration under Chinese sovereignty, though this was rejected by China. A deal was reached in 1984—under the terms of the Sino-British Joint Declaration, Hong Kong would become a special administrative region of the People's Republic of China. The handover ceremony in 1997 marked for many, including King Charles III, then Prince of Wales, who was in attendance, "the end of Empire", though many British territories that are remnants of the empire still remain. Legacy. Britain retains sovereignty over 14 territories outside the British Isles. In 1983, the British Nationality Act 1981 renamed the existing Crown Colonies as "British Dependent Territories", and in 2002 they were renamed the British Overseas Territories. Most former British colonies and protectorates are members of the Commonwealth of Nations, a voluntary association of equal members, comprising a population of around 2.2 billion people. The United Kingdom and 14 other countries, all collectively known as the Commonwealth realms, voluntarily continue to share the same person—King Charles III—as their respective head of state. 
These 15 nations are distinct and equal legal entities: the United Kingdom, Australia, Canada, New Zealand, Antigua and Barbuda, The Bahamas, Belize, Grenada, Jamaica, Papua New Guinea, Saint Kitts and Nevis, Saint Lucia, Saint Vincent and the Grenadines, Solomon Islands and Tuvalu. During the colonial era, emphasis was given to the study of the classical Greco-Roman heritage and the Greco-Roman experience of empire, with the aim of understanding how that heritage could be applied to improve the future of the colonies. The United States, which throughout its early rise had challenged British claims of being the "New Rome", succeeded Britain as the dominant global power in the mid-20th century; the two countries' historical ties and wartime collaboration supported a peaceful handoff of power after World War II. As for the United Kingdom itself, British views of the former empire are more positive than is the case with other post-imperial nations; discourse around the former empire has continued to shape the nation's present-day understanding of itself, as seen in the debate leading up to its decision to leave the European Union in 2016. Decades, and in some cases centuries, of British rule and emigration have left their mark on the independent nations that rose from the British Empire. The empire established the use of the English language in regions around the world. Today it is the primary language of up to 460 million people and is spoken by about 1.5 billion as a first, second or foreign language. It has also significantly influenced other languages. Individual and team sports developed in Britain—particularly football, cricket, lawn tennis, and golf—were exported. Some sports were also invented or standardised in the former colonies, such as badminton, polo, and snooker in India (see also: Sport in British India). British missionaries, who often travelled the globe in advance of soldiers and civil servants, spread Protestantism (including Anglicanism) to all continents. The British Empire provided refuge for religiously persecuted continental Europeans for hundreds of years. British educational institutions also remain popular in the present day, in part due to the importance of the English language and the similarity of British curriculums to those in the former colonies. Political boundaries drawn by the British did not always reflect homogeneous ethnicities or religions, contributing to conflicts in formerly colonised areas. The British Empire was responsible for large migrations of peoples (see also: Commonwealth diaspora). Millions left the British Isles, with the founding settler colonist populations of the United States, Canada, Australia and New Zealand coming mainly from Britain and Ireland. Millions of people moved between British colonies, with large numbers of South Asian people emigrating to other parts of the empire, such as Malaysia and Fiji, and Overseas Chinese people to Malaysia, Singapore and the Caribbean; about half of all modern immigration to Commonwealth nations continues to come from other Commonwealth nations. The demographics of the United Kingdom changed after the Second World War owing to immigration to Britain from its former colonies. In the 19th century, innovation in Britain led to revolutionary changes in manufacturing, the development of factory systems, and the growth of transportation by railway and steamship. Debate has also occurred as to what extent the Industrial Revolution, which originated in the United Kingdom, was facilitated by or dependent on imperialism. 
British colonial architecture, such as in churches, railway stations and government buildings, can be seen in many cities that were once part of the British Empire; Western technologies and architecture were globalised in part due to the Empire's military and administrative requirements. Integration of former colonies into the global economy was also a major legacy. The British system of measurement, the imperial system, continues to be used in some countries in various ways. The convention of driving on the left-hand side of the road has been retained in much of the former empire. The Westminster system of parliamentary democracy has served as the template for the governments of many former colonies, and English common law for legal systems. It has been observed that almost every former colony that emerged from colonial rule as a stable democracy was a British colony, though this correlation greatly declines in strength after 30 years of an ex-colony's independence. International commercial contracts are often based on English common law. The British Judicial Committee of the Privy Council still serves as the highest court of appeal for twelve former colonies. Interpretations of Empire. Historians' approaches to understanding the British Empire are diverse and evolving. Two key sites of debate over recent decades have been the impact of post-colonial studies, which seek to critically re-evaluate the history of imperialism, and the continued relevance of the historians Ronald Robinson and John Gallagher, whose work greatly influenced imperial historiography during the 1950s and 1960s. In addition, differing assessments of the empire's legacy remain relevant to debates over recent history and politics, such as the Anglo-American invasions of Iraq and Afghanistan, as well as Britain's role and identity in the contemporary world. Historians such as Caroline Elkins have argued against perceptions of the British Empire as a primarily liberalising and modernising enterprise, criticising its widespread use of violence and emergency laws to maintain power. Common criticisms of the empire include the use of detention camps in its colonies, massacres of indigenous peoples, and famine-response policies. Some scholars, including Amartya Sen, assert that British policies worsened the famines in India that killed millions during British rule. Conversely, historians such as Niall Ferguson argue that the economic and institutional development the British Empire brought resulted in a net benefit to its colonies. Other historians treat its legacy as varied and ambiguous. Public attitudes towards the empire within 21st-century Britain have been broadly positive, although sentiment towards the Commonwealth has been one of apathy and decline.
4726
47694944
https://en.wikipedia.org/wiki?curid=4726
Batman (1989 film)
Batman is a 1989 superhero film based on the DC Comics character of the same name, created by Bob Kane and Bill Finger. Directed by Tim Burton, it is the first installment of Warner Bros.' initial "Batman" film series. The film was produced by Jon Peters and Peter Guber and stars Jack Nicholson, Michael Keaton, Kim Basinger, Robert Wuhl, Pat Hingle, Billy Dee Williams, Michael Gough, and Jack Palance. The film takes place early in the war on crime of the title character (Keaton) and depicts his conflict with his archenemy the Joker (Nicholson). After Burton was hired as director in 1986, Steve Englehart and Julie Hickson wrote film treatments before Sam Hamm wrote the first screenplay. "Batman" was not greenlit until after the success of Burton's "Beetlejuice" (1988). The tone and themes of the film were partly influenced by Alan Moore and Brian Bolland's "Batman: The Killing Joke" and Frank Miller's "The Dark Knight Returns". The film primarily adapts, and then diverges from, the "Red Hood" origin story for the Joker, having Batman inadvertently cause gangster Jack Napier to fall into a vat of acid at Axis Chemicals, triggering his transformation into the psychotic Joker. Additionally, Batman co-creator Bob Kane worked as a consultant for the film. Numerous leading men were considered for the role of Batman before Keaton was cast. Keaton's casting was controversial since, by 1988, he had become typecast as a comedic actor and many observers doubted he could portray a serious role. Nicholson accepted the role of the Joker under strict conditions that dictated top billing, a portion of the film's earnings (including associated merchandise), and his own shooting schedule. Filming took place at Pinewood Studios from October 1988 to January 1989. The budget escalated from $30 million to $48 million, while the 1988 Writers Guild of America strike forced Hamm to drop out. Warren Skaaren did rewrites, with additional uncredited drafts done by Charles McKeown and Jonathan Gems. "Batman" was both critically and financially successful, earning over $400 million in box office totals. Critics and audiences particularly praised Nicholson and Keaton's performances, Burton's direction, the production design, and Danny Elfman's score. It was the sixth-highest-grossing film in history at the time of its release. The film received several Saturn Award nominations and a Golden Globe nomination for Nicholson's performance, and won the Academy Award for Best Art Direction. The film was followed by three sequels: "Batman Returns" (1992), with both Burton and Keaton returning; "Batman Forever" (1995), which featured Val Kilmer in the lead role; and "Batman & Robin" (1997), which featured George Clooney in the role. Keaton eventually reprised the role of Batman in the DC Extended Universe film "The Flash" (2023). The film also led to the development of "Batman: The Animated Series" (1992–1995), which in turn began the DC Animated Universe of spin-off media, and has influenced Hollywood's modern marketing and development techniques for the superhero film genre. Plot. Newspaper reporter Alexander Knox and photojournalist Vicki Vale investigate sightings of "Batman", a masked vigilante targeting Gotham City's criminals. Both attend a fundraiser hosted by billionaire Bruce Wayne, who is secretly Batman, having chosen this path after witnessing a mugger murder his parents when he was a child. During the event, Wayne becomes infatuated with Vale. 
Meanwhile, mob boss Carl Grissom sends his sociopathic second-in-command Jack Napier to break into Axis Chemicals and retrieve incriminating evidence. However, this is secretly a ploy to have Napier murdered for carrying on an affair with Grissom's mistress, Alicia Hunt. Corrupt Gotham City police lieutenant Max Eckhardt arranges the hit on Napier by conducting an unauthorized raid on Axis Chemicals. However, Commissioner James Gordon learns of the raid and takes command, ordering the officers to capture Napier alive. Batman also appears, while Napier shoots and kills Eckhardt in revenge for the double-cross. During a scuffle with Batman, Napier topples off a catwalk and falls into a vat of chemicals. Although presumed dead, Napier survives, albeit with various disfigurements, including chalk-white skin and emerald-green hair and nails. He undergoes surgery to repair the damage, but ends up with a rictus grin. Driven insane by his hideous appearance, Napier, now calling himself "the Joker", kills Grissom, massacres Grissom's associates, and takes over his operations. The Joker begins terrorizing Gotham by lacing various hygiene products with "Smylex" – a deadly chemical that causes victims to die laughing. The Joker soon becomes obsessed with Vicki and lures her to the Flugelheim Museum, which his henchmen start vandalizing. Batman rescues Vicki, takes her to the Batcave, and provides her with all of his research on Smylex, which will allow Gotham's residents to avoid the toxin. Conflicted by his love for her, Wayne visits her apartment intending to reveal his secret identity, only for the Joker to interrupt the meeting. The Joker asks Wayne, "Have you ever danced with the devil in the pale moonlight?", which Wayne recognizes as the phrase used by the mugger who killed his parents, and he realizes that the killer was the Joker (as Napier) all along. The Joker shoots Wayne, who survives thanks to a serving tray hidden underneath his shirt. Vicki is taken to the Batcave by Wayne's butler, Alfred Pennyworth, who has been quietly encouraging the relationship between the pair. After Vicki learns his secret, Wayne chooses to battle the Joker for the sake of the city over their relationship. He then departs to destroy the Axis plant used to create Smylex. Meanwhile, the Joker lures Gotham's citizens to a parade honoring the city's bicentennial with the promise of free money. This turns out to be a trap designed to dose them with Smylex gas held within giant parade balloons. Batman foils the plan by using his Batwing to remove the balloons, but the Joker shoots him down. The Batwing crashes in front of a cathedral, where the Joker takes Vicki hostage. Batman pursues the Joker and, in the ensuing fight, tells him that Napier murdered his parents and thus indirectly created Batman. The Joker eventually pulls Batman and Vicki over the edge of the cathedral's roof, leaving them hanging while he calls in a helicopter. The helicopter is piloted by his goons, who throw down a ladder for him to climb. Batman uses a grappling hook to attach the Joker's leg to a crumbling gargoyle that eventually breaks off the roof. Unable to bear the statue's immense weight, the Joker falls to his death while Batman and Vicki make it to safety. Sometime later, Gordon announces that the police have arrested all of the Joker's men, effectively dismantling what remained of Carl Grissom's criminal organization, and unveils the Bat-Signal. Batman leaves the police a note, promising to defend Gotham should crime strike again, and asking them to use the Bat-Signal to summon him in times of need. 
Alfred takes Vicki to Wayne Manor, explaining that Wayne will be a little late. She responds that she is not surprised, as Batman looks at the signal's projection from a rooftop, standing watch over the city. Production. Development. In the late 1970s, Batman's popularity was waning. CBS was interested in producing a "Batman in Outer Space" film. Producers Benjamin Melniker and Michael E. Uslan purchased the film rights to Batman from DC Comics on October 3, 1979. It was Uslan's wish "to make the definitive, dark, serious version of Batman, the way Bob Kane and Bill Finger had envisioned him in 1939. A creature of the night; stalking criminals in the shadows." Richard Maibaum was approached to write a script with Guy Hamilton to direct, but the two turned down the offer. Uslan was unsuccessful in pitching "Batman" to various movie studios because they wanted the film to be similar to the campy 1960s television series. Columbia Pictures and United Artists were among those to turn down the film. A disappointed Uslan then wrote a script titled "Return of the Batman" to give the film industry a better idea of his vision for the film. Uslan later compared its dark tone to that of the successful four-part comic book "The Dark Knight Returns", which his script predated by six years. In November 1979, producers Jon Peters and Peter Guber joined the project. Melniker and Uslan became executive producers. The four felt it was best to pattern the film's development after that of "Superman" (1978). Uslan, Melniker and Guber pitched "Batman" to Universal Pictures, but the studio turned it down. Though no movie studios were yet involved, the project was publicly announced with a budget of $15 million in July 1980 at the Comic Art Convention in New York. Warner Bros., the studio behind the successful Superman film franchise, decided to accept and produce "Batman". Tom Mankiewicz completed a script titled "The Batman" in June 1983, focusing on Batman and Dick Grayson's origins, with the Joker and Rupert Thorne as villains and Silver St. Cloud as the romantic interest. Mankiewicz took inspiration from the limited series "Batman: Strange Apparitions", written by Steve Englehart. Comic book artist Marshall Rogers, who worked with Englehart on "Strange Apparitions", was hired for concept art. "The Batman" was then announced in late 1983 for a mid-1985 release date on a budget of $20 million. Originally, Mankiewicz had wanted an unknown actor for Batman, William Holden for James Gordon, David Niven as Alfred Pennyworth, and Peter O'Toole as the Penguin, whom Mankiewicz wanted to portray as a mobster with low body temperature. Holden died in 1981 and Niven in 1983, so this casting never came to pass. A number of filmmakers were attached to Mankiewicz's script, including Ivan Reitman and Joe Dante. Reitman wanted to cast Bill Murray as Batman and Eddie Murphy as Robin. Nine rewrites were performed by nine separate writers, most of them based on "Strange Apparitions"; however, it was still Mankiewicz's script that was being used to guide the project. Because of their work together on the film "Swamp Thing" (1982), Wes Craven was among the directors whom Melniker and Uslan considered. After the financial success of "Pee-wee's Big Adventure" (1985), Warner Bros. hired Tim Burton to direct "Batman". Burton had then-girlfriend Julie Hickson write a new 30-page film treatment, feeling the previous script by Mankiewicz was campy. 
The success of "The Dark Knight Returns" and the graphic novel "" rekindled Warner Bros.' interest in a film adaptation. Burton was initially not a comic book fan, but he was impressed by the dark and serious tone found in both "The Dark Knight Returns" and "The Killing Joke". Warner Bros. enlisted the aid of Englehart to write a new treatment in March 1986. Like Mankiewicz's script, it was based on his own "Strange Apparitions" and included Silver St. Cloud, Dick Grayson, the Joker, and Rupert Thorne, as well as a cameo appearance by the Penguin. Warner Bros. was impressed, but Englehart felt there were too many characters. He removed the Penguin and Dick Grayson in his second treatment, finishing in May 1986. Burton approached Sam Hamm, a comic book fan, to write the screenplay. Hamm decided not to use an origin story, feeling that flashbacks would be more suitable and that "unlocking the mystery" would become part of the storyline. He reasoned, "You totally destroy your credibility if you show the literal process by which Bruce Wayne becomes Batman." Hamm replaced Silver St. Cloud with Vicki Vale and Rupert Thorne with his own creation, Carl Grissom. He completed his script in October 1986, which demoted Dick Grayson to a cameo rather than a supporting character. One scene in Hamm's script had a young James Gordon on duty the night of the murder of Bruce Wayne's parents. When Hamm's script was rewritten, the scene was deleted, reducing it to a photo in the "Gotham Globe" newspaper seen in the film. Warner Bros. was less willing to move forward on development, despite their enthusiasm for Hamm's script, which Kane greeted with positive feedback. Hamm's script was then bootlegged at various comic book stores in the United States. "Batman" was finally given the greenlight to commence pre-production in April 1988, after the success of Burton's "Beetlejuice" the same year. When comic book fans found out about Burton directing the film with Michael Keaton starring in the lead role, controversy arose over the tone and direction "Batman" was going in. Hamm explained, "They hear Tim Burton's name and they think of "Pee-wee's Big Adventure". They hear Keaton's name, and they think of any number of Michael Keaton comedies. You think of the 1960s version of "Batman", and it was the complete opposite of our film. We tried to market it with a typical dark and serious tone, but the fans didn't believe us." To combat negative reports on the film's production, Kane was hired as creative consultant. Batman's co-creator, Bill Finger, was uncredited at the time of the film's release and his name was not added to any Batman-related media until 2016. Casting. Parallel to the "Superman" casting, a variety of Hollywood A-listers were considered for the role of Batman, including Mel Gibson, Kevin Costner, Charlie Sheen, Tom Selleck, Bill Murray, Harrison Ford and Dennis Quaid. Burton was pressured by Warner Bros. to cast an obvious action movie star, and had approached Pierce Brosnan, but he had no interest in playing a comic book character. Burton was originally interested in casting an unknown actor, Willem Dafoe, who was falsely reported to be considered for the Joker but had actually been considered for Batman early in development. Producer Jon Peters suggested Michael Keaton, arguing he had the right "edgy, tormented quality" after having seen his dramatic performance in "Clean and Sober". Having directed Keaton in "Beetlejuice", Burton agreed. 
The casting of Keaton caused a furor among comic book fans, with 50,000 protest letters sent to Warner Bros. offices. Kane, Hamm, and Uslan also heavily questioned the casting. "Obviously there was a negative response from the comic book people. I think they thought we were going to make it like the 1960s TV series, and make it campy, because they thought of Michael Keaton from "Mr. Mom" and "Night Shift" and stuff like that." Keaton studied "The Dark Knight Returns" for inspiration. Tim Curry, David Bowie, John Lithgow, Brad Dourif, Ray Liotta, and James Woods were all considered for the Joker. Lithgow, during his audition, attempted to talk Burton out of casting him, a decision he would later publicly regret, stating, "I didn't realize it was such a big deal." Burton wanted to cast John Glover, but the studio insisted on using a movie star. Robin Williams lobbied hard for the part. Jack Nicholson had been the studio's top choice since 1980. Peters approached Nicholson as far back as 1986, during filming of "The Witches of Eastwick"; unlike Keaton, he was a popular choice for his role. Nicholson had what was known as an "off-the-clock" agreement. His contract specified the number of hours he was entitled to have off each day, from the time he left the set to the time he reported back for filming, as well as time off for Los Angeles Lakers home games. Nicholson demanded that all of his scenes be shot in a three-week block, but the schedule lapsed into 106 days. He reduced his standard $10 million fee to $6 million in exchange for a cut of the film's earnings (including associated merchandise), which led to remuneration in excess of $50 million—biographer Marc Eliot reports that Nicholson may have received as much as $90 million. He also demanded top billing on promotional materials. Sean Young was originally cast as Vicki Vale, but was injured in a horse-riding accident prior to the commencement of filming. Young's departure necessitated an urgent search for an actress who, besides being right for the part, could commit to the film at very short notice. Peters suggested Kim Basinger: she was able to join the production immediately and was cast. As a fan of Michael Gough's work in various Hammer Film Productions, Burton cast Gough as Bruce Wayne's mysterious butler, Alfred. Reporter Alexander Knox was portrayed by Robert Wuhl. In the original script, Knox was killed by the Joker's poison gas during the climax, but the filmmakers "liked [my] character so much," Wuhl said, "that they decided to let me live." Burton chose Billy Dee Williams as Harvey Dent because he wanted to include the villain Two-Face in a future film, envisioning an African-American actor playing on the character's black-and-white duality, but Tommy Lee Jones was later cast in the role for "Batman Forever", which disappointed Williams. Nicholson convinced the filmmakers to cast his close friend Tracey Walter as the Joker's henchman Bob. Irish child actor Ricky Addison Reed was cast as Dick Grayson before the character was removed by Warren Skaaren from the revised shooting script. The rest of the cast included Pat Hingle as Commissioner Gordon, Jerry Hall as Alicia, Lee Wallace as Mayor Borg, William Hootkins as Lt. Eckhardt, and Jack Palance as crime boss Carl Grissom. Design. 
Burton had been impressed with the design of Neil Jordan's "The Company of Wolves" (1984), but was unable to hire its production designer Anton Furst for "Beetlejuice", as Furst had instead committed to Jordan's London-filmed ghost comedy "High Spirits" (1988), a choice Furst later regretted. A year later, Burton successfully hired Furst for "Batman", and they enjoyed working with each other. "I don't think I've ever felt so naturally in tune with a director," Furst said. "Conceptually, spiritually, visually, or artistically. There was never any problem because we never fought over anything. Texture, attitude and feelings are what Burton is a master at." Furst and the art department deliberately mixed clashing architectural styles to "make Gotham City the ugliest and bleakest metropolis imaginable". Furst continued, "[W]e imagined what New York City might have become without a planning commission. A city run by crime, with a riot of architectural styles. An essay in ugliness. As if hell erupted through the pavement and kept on going". Terry Gilliam's 1985 film "Brazil" was also a notable influence on the production design, as both Burton and Furst studied it as a reference. Black and white charcoal drawings of key locations and sets were created by Furst's longtime draftsman, Nigel Phelps. Derek Meddings served as the visual effects supervisor, overseeing the miniatures and animation. Conceptual illustrator Julian Caldow designed the Batmobile, Batwing and assorted bat-gadgets that were later constructed by prop builder John Evans. Keith Short sculpted the final body of the 1989 Batmobile, adding two Browning machine guns. On designing the Batmobile, Furst explained, "We looked at jet aircraft components, we looked at war machines, we looked at all sorts of things. In the end, we went into pure expressionism, taking the Salt Flat Racers of the 30s and the Sting Ray macho machines of the 50s". The car was built upon a Chevrolet Impala after previous development with a Jaguar and a Ford Mustang failed. The car itself was later purchased by stand-up comedian and ventriloquist Jeff Dunham, who had it outfitted with a Corvette engine to make it street legal. Costume designer Bob Ringwood turned down the chance to work on "Licence to Kill" in favor of "Batman". Ringwood found it difficult to design the Batsuit because "the image of Batman in the comics is this huge, big six-foot-four hunk with a dimpled chin. Michael Keaton is a guy with average build", he stated. "The problem was to make somebody who was average-sized and ordinary-looking into this bigger-than-life creature." Burton commented, "Michael is a bit claustrophobic, which made it worse for him. The costume put him in a dark, Batman-like mood though, so he was able to use it to his advantage". Burton's idea was to use an all-black suit, which was met with positive feedback from Bob Kane. Vin Burnham was tasked with sculpting the Batsuit, in association with Alli Eynon. Jon Peters wanted to use a Nike product placement with the Batsuit. Ringwood studied over 200 comic book issues for inspiration. Twenty-eight sculpted latex designs were created, along with 25 different cape looks and six different heads, at a total cost of $250,000. Comic book fans initially expressed negative feedback about the Batsuit. Burton opted not to use tights, spandex, or underpants as seen in the comic book, feeling they were not intimidating. Prosthetic makeup designer Nick Dudman used an acrylic-based makeup paint called PAX for Nicholson's chalk-white face. 
Part of Nicholson's contract was approval over the makeup designer. Filming. The filmmakers considered filming "Batman" entirely on the Warner Bros. backlot in Burbank, California, but media interest in the film made them change the location. It was shot at Pinewood Studios in England from October 10, 1988, to February 14, 1989, with 80 days of main shooting and 86 days of second unit shooting. Eighteen sound stages were used, seven of them occupied at one time, along with the 51-acre backlot for the Gotham City set, one of the biggest ever built at the studio. Locations included Knebworth House and Hatfield House doubling for Wayne Manor, plus Acton Lane Power Station and Little Barford Power Station. The original production budget escalated from $30 million to $48 million. Filming was highly secretive: the unit publicist was offered, and refused, £10,000 for the first pictures of Jack Nicholson as the Joker. The police were later called in when two reels of footage (about 20 minutes' worth) were stolen. Beset by various problems during filming, Burton called the production "Torture. The worst period of my life!" Hamm was not allowed to perform rewrites during the 1988 Writers Guild of America strike. Warren Skaaren, who had also worked on Burton's "Beetlejuice", did rewrites. Jonathan Gems and Charles McKeown rewrote the script during filming. Only Skaaren received screenplay credit with Hamm. Hamm criticized the rewrites, but blamed the changes on Warner Bros. Burton explained, "I don't understand why that became such a problem. We started out with a script that everyone liked, although we recognized it needed a little work." Dick Grayson appeared in the shooting script but was deleted because the filmmakers felt he was irrelevant to the plot; Kane supported this decision. Keaton used his comedic experience for scenes such as Bruce and Vicki's Wayne Manor dinner. He called himself a "logic freak" and was concerned that Batman's secret identity would in reality be fairly easy to uncover. Keaton discussed ideas with Burton to better disguise the character, including the use of contact lenses. Ultimately, Keaton decided to perform Batman's voice at a lower register than when he was portraying Bruce Wayne, which became a hallmark of the film version of the character, with Christian Bale later using the same technique. Originally, in the climax, the Joker was meant to kill Vicki Vale, sending Batman into a vengeful fury. Jon Peters reworked the climax without telling Burton and commissioned production designer Anton Furst to create a model of the cathedral. This cost $100,000 when the film was already well over budget. Burton disliked the idea, having no clue how the scene would end: "Here were Jack Nicholson and Kim Basinger walking up this cathedral, and halfway up Jack turns around and says, 'Why am I walking up all these stairs? Where am I going?' 'We'll talk about it when you get to the top!' I had to tell him that I didn't know." Music. Burton hired Danny Elfman of Oingo Boingo, his collaborator on "Pee-wee's Big Adventure" and "Beetlejuice", to compose the music score. For inspiration, Elfman was given "The Dark Knight Returns". Elfman was worried, as he had never worked on a production of this budget and scale. In addition, producer Jon Peters was skeptical of hiring Elfman, but was later convinced when he heard the opening number. Peters and Peter Guber wanted Prince to write music for the Joker and Michael Jackson to do the romance songs. 
Elfman would then combine the styles of Prince's and Jackson's songs for the entire film score. At the encouragement of Prince's then-manager Albert Magnoli, it was agreed that Prince himself would write and sing the film's songs. Burton protested the ideas, saying, "my movies aren't commercial like "Top Gun"." Elfman enlisted the help of composer Shirley Walker and Oingo Boingo lead guitarist Steve Bartek to arrange the compositions for the orchestra. Elfman was later displeased with the audio mixing of his film score. ""Batman" was done in England by technicians who didn't care, and the non-caring showed," he stated. "I'm not putting down England because they've done gorgeous dubs there, but this particular crew elected not to." "Batman" was one of the first films to spawn two soundtrack albums: one featured songs written by Prince, while the other showcased Elfman's score. Both were successful, and compilations of Elfman's opening credits music were used in the title sequence theme for "Batman: The Animated Series", which was composed by Shirley Walker. Themes. When discussing the central theme of "Batman", director Tim Burton explained, "the whole film and mythology of the character is a complete duel of the freaks. It's a fight between two disturbed people", adding, "The Joker is such a great character because there's a complete freedom to him. Any character who operates on the outside of society and is deemed a freak and an outcast then has the freedom to do what they want... They are the darker sides of freedom. Insanity is in some scary way the most freedom you can have, because you're not bound by the laws of society". Burton saw Bruce Wayne as the bearer of a double identity, exposing one persona while hiding the real one from the world. Burton biographer Ken Hanke wrote that Bruce Wayne, struggling with his alter ego as Batman, is depicted as an antihero. Hanke felt that Batman has to push the boundaries of civil justice to deal with certain criminals, such as the Joker. Kim Newman theorized that "Burton and the writers saw Batman and the Joker as a dramatic antithesis, and the film deals with their intertwined origins and fates to an even greater extent". "Batman" conveys trademarks found in 1930s pulp magazines, notably in the design of Gotham City, stylized with Art Deco elements. Richard Corliss, writing for "Time", observed that Gotham's design was a reference to films such as "The Cabinet of Dr. Caligari" (1920) and "Metropolis" (1927). "Gotham City, despite being shot on a studio backlot", he continued, "is literally another character in the script. It has the demeaning presence of German Expressionism and fascist architecture, staring down at the citizens." Hanke further addressed the notion of "Batman" being a period piece, in that "The citizens, cops, people and the black-and-white television looks like it takes place in 1939"; but later said: "Had the filmmakers made Vicki Vale a femme fatale rather than a damsel in distress, this could have made "Batman" a homage and tribute to classic film noir." Portions of the climax pay homage to "Vertigo" (1958). Marketing. The B.D. Fox ad agency created hundreds of unused logos and posters for promotion, many by John Alvin. In the end, Burton and the producers decided to use only a gold and black logo designed by Anton Furst and airbrushed by Bill Garland, with no other key art variation, to keep an air of mystery about the film. The logo is also an ambiguous image, which can be read either as Batman's symbol or as a gaping mouth. 
Earlier designs "had the word 'Batman' spelled in "RoboCop" or "Conan the Barbarian"-type font". Jon Peters unified all the film's tie-ins, even turning down $6 million from General Motors to build the Batmobile because the car company would not relinquish creative control. During production, Peters read in "The Wall Street Journal" that comic book fans were unsatisfied with the casting of Michael Keaton. In response, Peters rushed out the first trailer, which played in thousands of theaters during Christmas. It was simply an assemblage of scenes without music, but it created enormous anticipation for the film, with audiences clapping and cheering. DC Comics allowed screenwriter Sam Hamm to write his own comic book miniseries. Hamm's stories were collected in the graphic novel "Batman: Blind Justice", with artwork by Denys Cowan and Dick Giordano. "Blind Justice" tells the story of Bruce Wayne trying to solve a series of murders connected to Wayne Enterprises. It also marks the first appearance of Henri Ducard, who was later used in the rebooted "Batman Begins", albeit as an alias for the more notable Ra's al Ghul. In the months before "Batman"'s release in June 1989, a popular culture phenomenon known as "Batmania" began, and over $750 million worth of merchandise was sold. Cult filmmaker and comic book writer Kevin Smith remembered: "That summer was huge. You couldn't turn around without seeing the Bat-Signal somewhere. People were cutting it into their fucking heads. It was just the summer of Batman and if you were a comic book fan it was pretty hot." Hachette Book Group USA published a novelization, "Batman", written by Craig Shaw Gardner; it remained on "The New York Times" Best Seller list throughout June 1989. Burton admitted he was annoyed by the publicity. David Handelman of "The New York Observer" categorized "Batman" as a high-concept film, believing "it is less movie than a corporate behemoth". Reception. Box office. "Batman" grossed $2.2 million in late-night previews on June 22, 1989, on 1,215 screens and $40.49 million in 2,194 theaters during its opening weekend. This broke the opening-weekend records held by "Indiana Jones and the Last Crusade" (which had a four-day Memorial Day weekend gross of $37.0 million the previous month) and "Ghostbusters II" (which had a $29.4 million three-day weekend the previous weekend). The film opened at number one, ahead of "Honey, I Shrunk the Kids". It also held the record for the largest opening weekend of a Jack Nicholson film for 14 years, until "Anger Management" (2003). "Batman" also set a record for a second-weekend gross with $30 million (the second-biggest three-day weekend of all time at that point) and became the fastest film to earn $100 million, reaching it in 11 days (10 days plus late-night previews). The film closed on December 14, 1989, with a final gross of $251.4 million in North America and $160.2 million internationally, totaling $411.6 million. It held the record as the highest-grossing Warner Bros. film until "Twister" surpassed it in 1996, and was the highest-grossing film based on a DC comic book until 2008's "The Dark Knight". Furthermore, "Batman" held the record as the highest-grossing superhero film of all time until "Spider-Man" took it in 2002. The film's gross ranks 143rd all-time in North America. 
Although "Indiana Jones and the Last Crusade" made the most money worldwide in 1989, "Batman" was able to beat "The Last Crusade" in North America, and made a further $150 million in home video sales. Box Office Mojo estimates that the film sold more than 60 million tickets in the US. Despite the film's box office – over $400 million against a budget of no more than $48 million – Warner Bros. claimed it ended up losing $35.8 million and "not likely to ever show a profit," which has been attributed to a case of Hollywood accounting. Critical response. "Batman" was criticized by some for being too dark, but nonetheless received a generally positive response from critics. On review aggregator Rotten Tomatoes, the film holds an approval rating of 77% based on 142 reviews, with an average score of 7.1/10. The website's critical consensus reads, "An eerie, haunting spectacle, "Batman" succeeds as dark entertainment, even if Jack Nicholson's Joker too often overshadows the title character." On Metacritic, the film received a weighted average score of 69 based on 21 reviews, indicating "generally favorable" reviews. Audiences polled by CinemaScore gave the film an average grade of "A" on an A+ to F scale. Many observed that Burton was more interested in the Joker and the art and set production design than Batman or anything else in terms of characterization and screentime. Comic book fans reacted negatively over the Joker murdering Thomas and Martha Wayne; in the comic book, Joe Chill is responsible. Writer Sam Hamm said it was Burton's idea to have the Joker murder Wayne's parents. "The Writer's Strike was going on, and Tim had the other writers do that. I also hold innocent to Alfred letting Vicki Vale into the Batcave. Fans were ticked off with that, and I agree. That would have been Alfred's last day of employment at Wayne Manor," Hamm said. The songs written by Prince were criticized for being "too out of place". While Burton has stated he had no problem with the Prince songs, he was less enthusiastic with their use in the film. On the film, Burton remarked, "I liked parts of it, but the whole movie is mainly boring to me. It's OK, but it was more of a cultural phenomenon than a great movie." Despite initial negative reactions from comics fans prior to the film's release, Keaton's portrayal of Batman was generally praised. James Berardinelli called the film entertaining, with the highlight being the production design. However, he concluded, "the best thing that can be said about "Batman" is that it led to "Batman Returns", which was a far superior effort." "Variety" felt "Jack Nicholson stole every scene" but still greeted the film with positive feedback. Roger Ebert was highly impressed with the production design, but claimed ""Batman" is a triumph of design over story, style over substance, a great-looking movie with a plot you can't care much about." He also called the film "a depressing experience". On the syndicated television series "Siskel & Ebert", his reviewing partner Gene Siskel disagreed, describing the film as having a "refreshingly adult" approach with performances, direction and set design that "draws you into a psychological world". Legacy. Anton Furst and Peter Young won the Academy Award for Best Art Direction, while Nicholson was nominated for the Golden Globe Award for Best Actor (Musical or Comedy). 
The British Academy of Film and Television Arts nominated "Batman" in six categories (Production Design, Visual Effects, Costume Design, Makeup, Sound and Actor in a Supporting Role for Nicholson), but it won none of them. Nicholson, Basinger, the makeup department, and costume designer Bob Ringwood all received nominations at the Saturn Awards, and the film was also nominated for the Saturn Award for Best Fantasy Film and the Hugo Award for Best Dramatic Presentation. The success of "Batman" prompted Warner Bros. Animation to create the acclaimed "Batman: The Animated Series", beginning the long-running DC Animated Universe, and helped establish the modern superhero film genre. Series co-creator Bruce Timm stated that the television show's Art Deco design was inspired by the film, commenting, "our show would never have gotten made if it hadn't been for that first "Batman" movie." Burton joked, "ever since I did "Batman", it was like the first dark comic book movie. Now everyone wants to do a dark and serious superhero movie. I guess I'm the one responsible for that trend." "Batman" initiated the original "Batman" film series and spawned three sequels: "Batman Returns" (1992), "Batman Forever" (1995), and "Batman & Robin" (1997), the latter two of which were directed by Joel Schumacher instead of Burton and replaced Keaton as Batman with Val Kilmer and George Clooney, respectively. Executive producers Benjamin Melniker and Michael E. Uslan filed a breach of contract lawsuit in Los Angeles County Superior Court on March 26, 1992, claiming to be "the victims of a sinister campaign of fraud and coercion that has cheated them out of continuing involvement in the production of "Batman" and its sequels. We were denied proper credits, and deprived of any financial rewards for our indispensable creative contribution to the success of "Batman"." A superior court judge rejected the lawsuit. Total revenues of "Batman" have topped $2 billion, with Uslan claiming to have "not seen a penny more than that since our net profit participation has proved worthless." Warner Bros. offered the pair an out-of-court settlement, a sum described by Melniker and Uslan's attorney as "two popcorns and two Cokes". Reflecting on the twentieth anniversary of its release in a retrospective article on Salon.com, film commentator Scott Mendelson noted the continuing impact "Batman" has had on the film industry: the increasing importance of opening-weekend box office receipts; the narrowing window between a film's debut and its video release, which caused the demise of second-run movie theaters; the accelerated acquisition of pre-existing, pre-sold properties for film adaptations that can be readily leveraged for merchandising tie-ins; the primacy of the MPAA's PG-13 as the target rating for film producers; and more off-beat, non-traditional casting opportunities for genre films. The film was responsible for the British Board of Film Classification introducing its "12" age rating, as its content fell between what was expected for a "PG" or "15" certificate. The American Film Institute named Batman the 46th-greatest movie hero and the Joker the 45th-greatest movie villain on "AFI's 100 Years...100 Heroes and Villains". Robert Wuhl reprised his role as Alexander Knox in The CW's Arrowverse crossover, "Crisis on Infinite Earths". 
The event also retroactively established that the world of the film and its sequel, "Batman Returns", takes place on Earth-89, one of the worlds destroyed by the Anti-Monitor (LaMonica Garrett) during the Crisis. Michael Keaton reprised his role as Batman in "The Flash" (2023), set in the DC Extended Universe. Video games. Several video games based on the film were released: by Ocean Software in 1989, by Sunsoft in 1989 and 1990, and by Atari Games in 1991. Konami was also in talks to release an arcade game around the same time as Atari. Comic book continuations. In March 2016, artist Joe Quinones revealed several art designs that he and Kate Leth had created to pitch a comic book continuation set in the Batman '89 universe to DC Comics. The pitch, which was rejected, would have included the story of Billy Dee Williams' Harvey Dent turning into Two-Face, as well as the inclusion of characters such as Batgirl, in a story that took place following the events of "Batman Returns". In 2021, DC announced "Batman '89", a comic book continuation of the film, written by Sam Hamm and illustrated by Joe Quinones. The comic's synopsis revealed that it would include the return of Selina Kyle/Catwoman, the introduction of a new Robin, and the transformation of Williams' Harvey Dent into Two-Face. A follow-up series was announced by DC Comics on August 17, 2023; its first issue was released on November 28, 2023, again written by Hamm with art by Quinones. In the series, Batman has mysteriously disappeared after Dent's death, leading Gotham citizens to take to the streets to fight in his place, including Barbara Gordon, who becomes Batgirl. Scarecrow and Harley Quinn are featured as the main antagonists, seemingly referencing the unproduced fifth film in the Burton and Schumacher series, "Batman Unchained". Direct sequel novels. On April 11, 2024, a new tie-in novel was announced. Titled "Batman: Resurrection", it was written by author John Jackson Miller and acts as a direct sequel to the film, set between the events of "Batman" and "Batman Returns"; in it, Batman focuses on dismantling the remnants of the Joker's organization while contemplating the possibility that the Joker might not actually be dead. The novel also includes characters introduced in the sequel, such as Max Shreck. It was released on October 15, 2024, by Penguin Random House; a sequel, titled "Batman: Revolution", was later revealed by Miller and is scheduled for release in fall 2025. Home media. "Batman" has been released on various formats, including VHS, LaserDisc, DVD and Blu-ray. In an unprecedented move at the time, it was made available to buy on VHS in the United States on November 15, 1989, less than six months after its theatrical release, at a suggested retail price of only $24.95, although most sellers charged less. It was first released on DVD on March 25, 1997, as a double-sided disc containing both widescreen (1.85:1) and full-screen (1.33:1) versions of the film. The 2005 "Batman: The Motion Picture Anthology 1989–1997" included two-disc special-edition DVDs of the film and all three of its sequels. The anthology was also released as a four-disc Blu-ray set in 2009, with each film and its previous extras contained on a single disc. 
Other Blu-ray reissues include a "30th Anniversary" Digibook with a 50-page booklet and a steelcase edition, both of which also include a digital copy, as well as a "25th Anniversary" Diamond Luxe edition, which contained the same disc as before plus a second disc with a new 25-minute featurette, "Batman: The Birth of the Modern Blockbuster". The film was also included in "The Tim Burton Collection" DVD and Blu-ray set in 2012, alongside its first sequel, "Batman Returns". "Batman" was released on Ultra HD Blu-ray on June 4, 2019.
4727
48523215
https://en.wikipedia.org/wiki?curid=4727
Batman (1966 film)
Batman (also known as Batman: The Movie) is a 1966 American superhero film directed by Leslie H. Martinson. Based on the television series, and the first full-length theatrical adaptation of the DC Comics character of the same name, the film stars Adam West as Batman and Burt Ward as Robin. It was released two months after the last episode of the first season of the television series. The film includes most members of the original TV cast, with the exception of Julie Newmar as Catwoman, who was replaced in the film by Lee Meriwether. Plot. When Batman and Robin get a tip that Commodore Schmidlapp, owner of the Big Ben Distillery, is in danger aboard his yacht, they launch a rescue mission using the Batcopter. As Batman descends on the bat-ladder to land on the yacht, it suddenly vanishes beneath him. He rises out of the sea with a shark attacking his leg. After Batman dislodges it with bat-shark repellent, the shark explodes. Batman and Robin head back to Commissioner Gordon's office, where they deduce that the tip was a set-up by the United Underworld, a gathering of four of the most powerful villains in Gotham City: the Joker, the Penguin, the Riddler, and the Catwoman. The four criminals equip themselves with a dehydrator that can turn humans into dust until they are rehydrated (an invention of Schmidlapp's, who is unaware that he has been kidnapped), escape in a war-surplus, pre-atomic submarine made to resemble a penguin, and recruit three pirate-themed henchmen (Bluebeard, Morgan and Quetch). Batman and Robin learn that the yacht was really a holographic projection and return via Batboat to a buoy concealing a projector, where they are trapped by a magnet and targeted by torpedoes. They use a radio-detonator to destroy two of the torpedoes, and a porpoise sacrifices itself to intercept the last one. Catwoman, disguised as Soviet journalist "Kitayna Ireyna Tatanya Kerenska Alisoff" (whose initials spell "Kitka") of the Moscow "Bugle", helps the group kidnap Bruce Wayne and pretends to be kidnapped with him, as part of a plot to lure Batman and finish him off with another of the Penguin's explosive animals (the group being unaware that Bruce Wayne is Batman's alter ego). After Bruce Wayne fights his way out of captivity, he again dons the Batman costume, and the Dynamic Duo returns to the United Underworld's headquarters, only to find a smoking bomb. Batman rushes frantically around the docks, unable to find a safe place to dispose of the bomb, but finally does so in the nick of time. The Penguin disguises himself as the Commodore and schemes his way into the Batcave along with five dehydrated henchmen. This plan fails when the henchmen unexpectedly disappear into antimatter once struck: the Penguin mistakenly rehydrated them with toxic heavy water used to recharge the Batcave's atomic pile, leaving them highly unstable. Ultimately, Batman and Robin are unable to prevent the kidnapping of the dehydrated United World Organization's Security Council, consisting of ambassadors from Japan, the U.S., the U.S.S.R., Israel, France, Spain, West Germany, the United Kingdom, and Nigeria. Giving chase in the Batboat to retrieve them (and Miss Kitka, whom the duo presume to be still captive), Robin uses a sonic charge weapon to disable the Penguin's submarine and force it to surface, where a fist fight ensues. Although Batman and Robin come out on top, Batman is heartbroken to discover that his "true love" Miss Kitka is actually Catwoman when she loses her mask. 
Commodore Schmidlapp accidentally breaks the vials containing the powdered Council members and sneezes on them, scattering the dust. Batman sets to work, constructing an elaborate Super Molecular Dust Separator to filter the mingled dust. Robin asks him whether it might be in the world's best interests for them to alter the dust samples so that humans can no longer harm one another. Batman replies that they cannot, reminding Robin of the fate of the Penguin's henchmen and their tainted rehydration, and says they can only hope that people in general will learn to live together peacefully on their own. With the world watching, the Security Council is re-hydrated. All members are restored alive and well, but continue to squabble amongst themselves, totally oblivious of their surroundings; however, each of them now speaks the language and displays the stereotypical mannerisms of a nation other than their own. Batman quietly expresses his sincere hope to Robin that this "strange mixing of minds" does more good than harm. The duo quietly leaves United World Headquarters by climbing out of the window and descending on their bat-ropes. Production. Development. William Dozier wanted to make a big-screen film to generate interest in his proposed "Batman" television series, with the feature in theaters while the first season of the series was rolling before the cameras. The studio, 20th Century-Fox, refused because it would have to cover the entire cost of a movie, while it would only have to share the cost of a TV series (a much less risky proposition). The studio acquiesced after a 1965 screening of Columbia Pictures' 1943 "The Batman" serial in New York City renewed interest in the character and after the television series became phenomenally successful. The project was announced in the March 26, 1966, issue of "Variety". The film features many characters from the show. It was written by series writer Lorenzo Semple Jr., who completed the screenplay in 10 days, and directed by Leslie H. Martinson, who had directed a pair of the television series' first-season episodes: "The Penguin Goes Straight" and "Not Yet, He Ain't". Principal photography began on April 28, 1966, and concluded within 28 days, with a further three days to complete second-unit photography. Casting. The film includes most members of the original TV cast: the actors for Batman, Robin, Alfred, Gordon, O'Hara, Aunt Harriet, the Joker, the Penguin, and the Riddler all reprised their roles. Though Julie Newmar had at this point played Catwoman in two episodes of the TV series, she had other commitments at the time and was replaced by Lee Meriwether in the film. According to the "Biography" special "Catwoman: Her Many Lives", aired on July 20, 2004, Newmar was unable to reprise her role because of a back injury. Catwoman was nonetheless played by Newmar once again in eleven subsequent episodes of the series; Eartha Kitt would then play Catwoman in three episodes of the third season. Jack LaLanne has a cameo as a man on a rooftop with bikini-clad women. Tone and themes. Even though it is often described (like many contemporary shows) as a parody of a popular comic-book character, some commentators believe that its comedy is not so tightly confined. They felt the film's depiction of the Caped Crusader "captured the feel of the contemporary comics perfectly". The film was made at a time when "the Batman of the Golden Age comics was already essentially neutered". Certain elements verge on direct parody of Batman's history. 
The movie, like the TV series, is strongly influenced by the comparatively obscure 1940s Batman serials, as in the escapes that succeed almost by sheer luck. The penchant for giving devices a "Bat-" prefix and the dramatic use of stylized title cards during fight scenes acknowledge some of the conventions that the character had accumulated in various media, but the majority of "Batman"'s campier moments can be read as a broader parody of contemporary mid-1960s culture in general. Furthermore, the movie represented Batman's first major foray into Cold War issues, paying heavy attention to Polaris missiles and war-surplus submarines and taking a poke at the Pentagon. The inclusion of a glory-hunting presidential character and the unfavorable portrayal of Security Council members marked Batman's first attempts to poke fun at domestic and international politics. Vehicles. Besides the Batmobile, the Dynamic Duo use three new Bat-vehicles: the Batcopter, the Batboat, and the Batcycle. Of the three, which first appeared in the "Batman" film, only the Batcycle properly crossed over into the TV series, as the budgetary limits of the TV series precluded the full use of the others. While the Batcopter and Batboat from the movie appeared briefly in episodes (including a use of the Batboat in the conclusion of the first post-film two-parter, "Walk the Straight and Narrow"), they primarily did so in the form of stock-footage scenes from the film intercut into the series. Music. Nelson Riddle's original score for "Batman the Movie" was released in 2010 by La-La Land Records and Fox Music. The album contains the entire score as heard in the film, in chronological order, as well as an unreleased cue. This limited edition of 2,000 units includes a lavishly illustrated color booklet featuring exclusive liner notes by Brian Baterwhite. It was reissued in 2016 in a new limited edition of 2,500 units; while the program and master of this release are identical to the 2010 release, the reissue features all-new exclusive liner notes by John Takis and art design by Jim Titus. Release. Theatrical. "Batman" premiered at the Paramount Theatre in Austin, Texas, on July 30, 1966 (between the first and second seasons of the TV series); it was moderately successful at the box office. The Batboat featured in the film was created by Austin-based company Glastron, which was compensated by having the film premiere in its hometown. In conjunction with the premiere, Jean Boone of Austin CBS affiliate station KTBC interviewed the film's cast, including Lee Meriwether, Cesar Romero, and Adam West. Television. ABC, the network which had previously aired the "Batman" television series, first broadcast the film on the July 4, 1971, edition of "The ABC Sunday Night Movie"; the film was quickly rebroadcast on ABC on September 4 of that year. Home media. The film debuted on home video on VHS and Betamax in 1985 from Playhouse Video. It was apparently re-released in 1989 by CBS/Fox Video (after a Blockbuster Video executive asked why it "wouldn't" be repackaged on the heels of the successful "Batman" film of 1989), and again in 1994 by Fox Video. The film was released on DVD in 2001, and re-released July 1, 2008, on DVD and on Blu-ray by 20th Century Fox Home Entertainment. Reception. Box office. According to Fox records, the film needed to earn $3.2 million in rentals to break even and made $3.9 million. Critical response. "Batman" holds an 81% rating on Rotten Tomatoes based on thirty-six reviews. 
The site's consensus states: ""Batman: The Movie" elevates camp to an art form – and has a blast doing it, every gloriously tongue-in-cheek inch of the way." Bill Gibron of Filmcritic.com gave the film 3 out of 5 stars: "Unlike other attempts at bringing these characters to life...the TV cast really captures the inherent insanity of the roles". "Variety" stated in its review that "the intense innocent enthusiasm of Cesar Romero, Burgess Meredith and Frank Gorshin as the three criminals is balanced against the innocent calm of Adam West and Burt Ward, Batman and Robin respectively". Sequels. West and Ward, along with Julie Newmar, reprised their roles in animated "Batman" movies for the show's 50th anniversary. "Batman: Return of the Caped Crusaders" was released digitally on October 11, 2016, and on DVD and Blu-ray on November 1. A sequel, "Batman vs. Two-Face", was released on October 10, 2017, featuring William Shatner as the voice of Two-Face, the main antagonist. Adam West completed his voiceover work, but died of leukemia in June 2017, before it was released.
4728
39228190
https://en.wikipedia.org/wiki?curid=4728
Batman Returns
Batman Returns is a 1992 American superhero film directed by Tim Burton and written by Daniel Waters. Based on the DC Comics character Batman, it is the sequel to "Batman" (1989) and the second installment in the 1989–1997 "Batman" series. In the film, Batman comes into conflict with wealthy industrialist Max Shreck and malformed crime boss Oswald Cobblepot / The Penguin, who seek power regardless of the cost to Gotham City. Their plans are complicated by Shreck's former secretary, Selina Kyle, who seeks revenge against him as Catwoman. The cast includes Michael Keaton, Danny DeVito, Michelle Pfeiffer, Christopher Walken, Michael Gough, Pat Hingle, and Michael Murphy. Burton had no interest in making a sequel to "Batman", believing that he was creatively restricted by the expectations of Warner Bros. He agreed to return in exchange for creative control, including replacing original writer Sam Hamm with Daniel Waters, and hiring many of his previous creative collaborators. Waters's script focused on characterization over an overarching plot, and Wesley Strick was hired to complete an uncredited re-write which, among other elements, provided a master plan for the Penguin. Filming took place between September 1991 and February 1992, on a $50–80 million budget, on sets and sound stages at Warner Bros. Studios and the Universal Studios Lot in California. Special effects primarily involved practical applications and makeup, with some animatronics and computer-generated imagery. The film's marketing campaign was substantial, including brand collaborations and a variety of merchandise, aiming to repeat "Batman"'s financial success. Released on June 19, 1992, "Batman Returns" broke several box-office records and earned $266.8 million worldwide. However, it failed to replicate the success or longevity of "Batman" ($411.6 million); this was blamed on the darker tone as well as violent and sexual elements, which alienated family audiences and led to a backlash against marketing partners for promoting the film to young children. Reviews of the film were polarized, but most of the main cast was praised. After the relative failure of "Batman Returns", Burton was replaced as director of the third film, "Batman Forever" (1995), by Joel Schumacher, who took the series in a more family-friendly direction. Keaton chose not to reprise his role, disagreeing with Schumacher's vision. "Batman Forever" and its sequel, "Batman & Robin" (1997), were financial successes but fared less well critically. "Batman Returns" has been reassessed as one of the best "Batman" films in the decades since its release, and its incarnations of Catwoman and Penguin are considered iconic. A comic book, "Batman '89" (2021), continued the narrative of the original two Burton films, and Keaton reprised his version of Batman in "The Flash" (2023). Plot. In Gotham City, two wealthy socialites, dismayed at the birth of their malformed and feral son Oswald, discard the infant in the sewers, where he is adopted by a family of penguins. Thirty-three years later, during the Christmas season, wealthy industrialist Max Shreck is abducted by the Red Triangle gang—a group of former circus workers connected to child disappearances across the country—and brought to their hideout in the Arctic exhibit at the derelict Gotham Zoo. Their leader, Oswald—now named the Penguin—blackmails Shreck with evidence of his corruption and murderous acts to compel his assistance in reintegrating Oswald into Gotham's elite. 
Shreck orchestrates a staged kidnapping of the mayor's infant child, allowing Oswald to rescue it and become a public hero. In exchange, Oswald requests access to the city's birth records, ostensibly to learn his true identity by researching Gotham's first-born sons. Shreck attempts to murder his timid secretary, Selina Kyle, by pushing her out of a window after she accidentally uncovers his scheme to construct a power plant that would secretly drain and store Gotham's electricity. Selina survives, returns home, angrily crafts a costume and adopts the name Catwoman. To Shreck's surprise, Selina returns to work with newfound confidence and assertiveness, immediately capturing the attention of visiting billionaire Bruce Wayne. As his alter ego, the vigilante Batman, Wayne investigates Oswald, suspecting a connection to the Red Triangle gang. To remove obstacles to his power plant, Shreck persuades Oswald to run for mayor and undermine the incumbent by unleashing the Red Triangle gang on Gotham. Batman's attempts to stop the chaos lead to a confrontation with Catwoman. Meanwhile, Selina and Wayne start dating, while Catwoman teams up with Oswald to tarnish Batman's reputation. During Gotham's Christmas-tree lighting, Oswald and Catwoman kidnap Gotham's beauty queen, the Ice Princess, and lure Batman to a rooftop above the ceremony. Oswald uses a swarm of bats to send the Ice Princess plunging to her death, framing Batman for the murder. When Catwoman objects to the killing and rebuffs Oswald's sexual advances, he attacks her, sending her crashing through a greenhouse roof. Batman escapes in the Batmobile, unaware that the Red Triangle gang has sabotaged it, allowing Oswald to take it on a remote-controlled rampage. Batman records Oswald's insulting tirade against Gotham's citizens before regaining control, and later plays the recording during Oswald's mayoral rally, destroying his public image and forcing him to retreat to Gotham Zoo. There, Oswald renounces his humanity, fully embracing his identity as the Penguin, and sets his plan in motion to abduct and kill Gotham's first-born sons as revenge for his own abandonment. Selina attempts to kill Shreck at his charity ball, but Wayne intervenes, and they accidentally discover each other's secret identities. Penguin crashes the event, intending to kidnap Shreck's son, Chip, but Shreck offers himself instead. Batman disrupts the Red Triangle gang and halts the kidnappings, forcing the Penguin to unleash his missile-equipped penguin army to destroy Gotham. However, Batman's butler, Alfred Pennyworth, overrides the control signal, redirecting the penguins back to Gotham Zoo. As the missiles obliterate the zoo, Batman unleashes a swarm of bats, causing the Penguin to fall into the contaminated waters of the Arctic exhibit. Catwoman arrives to confront Shreck, rejecting Batman's plea to abandon her quest for vengeance and leave with him. Shreck shoots her four times, but she seems unaffected, claiming she has two of her nine lives left. Catwoman electrocutes Shreck, causing a power surge that appears to kill them both; however, Batman finds only Shreck's charred remains. The Penguin emerges one last time but succumbs to his injuries before he can attack Batman, and his penguins lay him to rest in the water. Sometime later, as Alfred drives Wayne home, Wayne spots Selina's silhouette but finds only a cat, which he takes with him. The Bat-Signal shines above the city as Catwoman gazes up at it. Cast. 
The cast of "Batman Returns" includes Andrew Bryniarski as Max's son Charles "Chip" Shreck and Cristi Conaway as the Ice Princess, Gotham's beauty queen-elect. Paul Reubens and Diane Salinger appear as Tucker and Esther Cobblepot, Oswald's wealthy, elite parents. Sean Whalen appears as a paperboy; Jan Hooks and Steve Witting play Jen and Josh, Oswald's mayoral image consultants. The Red Triangle gang includes the monkey-toting Organ Grinder (Vincent Schiavelli), the Poodle Lady (Anna Katarina), the Tattooed Strongman (Rick Zumwalt), the Sword Swallower (John Strong), the Knifethrower Dame (Erika Andersch), the Acrobatic Thug (Gregory Scott Cummins), the Terrifying Clown (Branscombe Richmond), the Fat Clown (Travis McKenna), and the Thin Clown (Doug Jones). Production. Development. Following the success of "Batman" (1989), which became the fifth-highest-grossing film of its time, a sequel was considered inevitable. Warner Bros. Pictures was confident in its potential, with discussions about a follow-up beginning by late 1989 and plans to start filming in May of the next year. The studio wanted Robin Williams and Danny DeVito to play the rogues Riddler and Penguin, respectively, and had also invested $2 million in acquiring the Gotham City sets at Pinewood Studios in England, intending to use them for at least two more sequels. These sets were kept under 24-hour surveillance, as it was more cost-effective to maintain them than to build new ones. Despite pressure from Warner Bros. to finalize a script and begin production, director Tim Burton was hesitant about returning for a sequel, saying the idea left him "dumbfounded", particularly before he had analyzed the performance of the first film. Burton was generally skeptical of sequels, believing they were only worthwhile if they offered a chance to explore something new and different. "Batman" writer Sam Hamm's initial story idea expanded the character of district attorney Harvey Dent, played in "Batman" by Billy Dee Williams, and his descent into the supervillain Two-Face. However, Warner Bros. wanted the main villain to be the Penguin, whom Hamm believed the studio saw as Batman's most prominent enemy after the Joker. Catwoman was added because Burton and Hamm were interested in the character. Hamm's drafts continued directly from "Batman", focusing on the relationship between Wayne and Vicki Vale (Kim Basinger) and their engagement. The Penguin was written as an avian-themed criminal who uses birds as weapons; Catwoman was more overtly sexualized, wore "bondage" gear, and nonchalantly murdered groups of men. The main narrative teamed Penguin and Catwoman to frame Batman for the murders of Gotham's wealthiest citizens in their pursuit of a secret treasure. Their quest leads them to Wayne Manor and reveals the Waynes' secret history. Among other things, Hamm originated the Christmastime setting and introduced Robin, Batman's sidekick, although his idea for assault rifle-wielding Santas was abandoned. Hamm ensured that Batman did not kill anyone and focused on protecting Gotham's homeless. The two drafts produced by Hamm failed to renew Burton's interest, and Burton concentrated on directing "Edward Scissorhands" (1990) and writing "The Nightmare Before Christmas" (1993) instead. Burton was confirmed to direct the sequel in January 1991, with filming scheduled to begin later that year for a 1992 release date. 
He agreed to return in exchange for creative control of the sequel; Burton considered "Batman" the least favorite of his films, describing it as occasionally boring. According to Denise Di Novi, his long-time producer, "Only about 50% of "Batman" was [Burton]"; the studio wanted "Batman Returns" to be "more of a Tim Burton movie... [a] weirder movie but also more hip and fun." Burton replaced key "Batman" crew with some of his former collaborators, including cinematographer Stefan Czapsky, production designer Bo Welch, creature-effects supervisor Stan Winston, makeup artist Ve Neill, and art directors Tom Duffield and Rick Heinrichs. Daniel Waters was hired to replace Hamm because Burton wanted someone with no emotional attachment to "Batman", and he liked Waters's script for the dark comedy "Heathers" (1988), which matched Burton's intended tone and creative direction. Burton reportedly disliked "Batman" producer Jon Peters, demoted him to executive producer of "Batman Returns", and effectively barred him from the set. Warner Bros. was the production company and distributor, with production assistance from executive producer Peter Guber's and Peters's Polygram Pictures. Writing. Waters began writing his first draft in mid-1990. Burton's only instructions were that the script have no connection to "Batman", outside of a single reference to Vale as Wayne's ex-girlfriend, and that Catwoman be given more characterization than a sexy vixen. Waters did not like the 1989 film and had no interest in following its narrative threads, acknowledging the comic-book histories of "Batman Returns" characters, or considering the opinions of their fans, saying: "We were really just about the art." Unlike Hamm, Waters was not opposed to Batman killing people, believing the character should reflect contemporary, darker times, and that the idea of a hero leaving captured villains for the authorities was outdated. Even so, Waters only had Batman kill when necessary so it would be more meaningful; he was unhappy with some of the unscripted on-screen deaths in the finished film, such as Batman blowing up a Red Triangle member. Much of Waters's "bitter and cynical" dialogue for Batman (such as Gotham City not deserving protection) was removed because Keaton said that Batman should rarely speak in costume and Burton wanted Batman to be driven by trauma, not nihilism. As a result, the script focused on the villains. Burton said that he initially struggled to understand the appeal of the Penguin's comic-book counterpart; Batman, Catwoman, and the Joker had clear psychological profiles, but the Penguin was "just this guy with a cigarette and a top hat." The initial draft made the character resemble a stereotypical DeVito character (an abrasive gangster), but Waters and Burton agreed to make him more "animalistic". They decided to make the Penguin a tragic figure, abandoned as an infant by his parents—a reflection of Batman's childhood trauma of losing his parents. Political and social satire was added, influenced by two episodes of the 1960s "Batman" television series ("Hizzoner the Penguin" and "Dizhonner the Penguin") in which the Penguin runs for mayor. Waters changed Hamm's Catwoman from a "fetishy sexual fantasy" femme fatale to a working-class, disenchanted secretary, writing her as an allegory of contemporary feminism. 
Although the character is influenced by feline mythology (such as cats having nine lives), Waters and Burton never intended the supernatural elements to be taken literally, and planned for Catwoman to die with Shreck during the electrical explosion in the film's denouement. Waters created Max Shreck—an original character named in honor of actor Max Schreck—to take the place of Harvey Dent/Two-Face. Shreck was written satirically as an evil industrialist who orchestrates the Penguin's mayoral run, in order to convey the message that true villains do not always wear costumes. In one version of the script, Shreck was the Penguin's more-favored brother. With four central characters to depict, Waters and Burton decided to remove Robin, written as a garage mechanic who helps Batman after the Penguin crashes the Batmobile. They were not particularly interested in retaining the character, whom Waters described as worthless. The Red Triangle gang, initially conceived as a troupe of performance artists, were changed to circus clowns at Burton's request. Waters said that his 160-page first draft was too outlandish and would have cost $400 million to produce, leading him to become more restrained. His fifth (and final) draft focused more on characterization and interaction than on plot. Burton and Waters eventually fell out over disagreements about the script and Waters's refusal to implement requested changes. Burton hired Wesley Strick to refine Waters's work, streamline dialogue, and lighten the tone. Warner Bros. executives mandated that Strick introduce a master plan for the Penguin, resulting in the addition of the plot to kidnap Gotham's first-born sons and threaten the city with missiles. Waters said that the changes to his work were relatively minor, but he was baffled by the Penguin's master plan. He made a final revision to Strick's shooting screenplay and, although Strick was on set for four months of filming and agreed-upon rewriting, Waters was the only screenwriter credited. Casting. Keaton reprised his role as Bruce Wayne / Batman for $10 million, double his salary for "Batman". Burton wanted to cast Marlon Brando as the Penguin, but Warner Bros. preferred Dustin Hoffman. Christopher Lloyd and Robert De Niro were also considered, but Danny DeVito became the frontrunner when Waters re-envisioned the character as a deformed human-bird hybrid. DeVito was initially reluctant to accept the role until he was convinced by his close friend Jack Nicholson, who had played the Joker in "Batman". To convey his vision, Burton gave DeVito a picture he had painted of a diminutive character sitting on a red-and-white striped ball with the caption, "my name is Jimmy, but my friends call me the hideous penguin boy." Casting Selina Kyle / Catwoman was difficult. Annette Bening initially secured the role, but had to drop out after becoming pregnant. Actresses lobbying for the part then included Ellen Barkin, Cher, Bridget Fonda, Jennifer Jason Leigh, Madonna, Julie Newmar, Lena Olin, Susan Sarandon, Raquel Welch, and Basinger. The most prominent candidate, however, was Sean Young (who had been cast as Vale in "Batman" before she was injured). Young went to the Warner Bros. lot in a homemade Catwoman costume for an impromptu audition for Burton, who reportedly hid under his desk (although Keaton and producer Mark Canton briefly met with her). She shared video of her efforts with "Entertainment Tonight" and also pitched in costume on "The Joan Rivers Show". Warner Bros. said that Young did not fit their vision for Catwoman. 
The role went to Pfeiffer, who was described as a proven actress who got along with Burton (although some publications said that the part would stretch her acting abilities). Pfeiffer had also been considered for Vale in "Batman", but Keaton vetoed the casting because they had been romantically involved and he believed that her presence would interfere with attempts to reconcile with his wife. She received a $3 million salary ($2 million more than Bening), plus a percentage of the gross profits. Pfeiffer trained for months in kickboxing with her stunt double, Kathy Long, mastering the whip and becoming proficient enough to perform her own stunts with the weapon. Shreck's appearance was modeled on Vincent Price in an unnamed older film, and Walken based his performance on moguls such as Sol Hurok and Samuel Goldwyn. He said, "I tend to play mostly villains and twisted people. Unsavory guys. I think it's my face, the way I look." Burgess Meredith (who played the Penguin in the 1960s TV series) was scheduled to make a cameo appearance as Penguin's father, Tucker Cobblepot, but became ill during filming. He was replaced by Paul Reubens; Diane Salinger played his wife, Esther. Both had starred in Burton's feature-film debut, "Pee-wee's Big Adventure" (1985). Although Robin was removed from the screenplay, the character's development was far enough along that Marlon Wayans was cast in the role (Burton had specifically wanted an African-American Robin) and costumes, sets, and action figures were made. In a 1998 interview, Wayans said that he still received residual checks as part of the two-film contract he had signed. Early reports suggested that Nicholson had been asked to return as the Joker, but had refused to film in England because of the salary tax on foreign talent. Nicholson denied being asked, however, believing that Warner Bros. would not want to replicate his generous compensation for "Batman". Filming. Principal photography began on September 3, 1991. Burton wanted to film in the United States with American actors because he believed that "Batman", which had been filmed in the United Kingdom, had "suffered from a British subtext." The economics of filming "Batman" in the United Kingdom had also changed, making it more cost-effective to remain in the U.S. This meant abandoning the Pinewood Studios sets in favor of Burton's new design. "Batman Returns" was filmed entirely on as many as eight soundstages at Warner Bros. Studios, Burbank, California, including Stage 16 (which housed the expansive Gotham Plaza set). An additional soundstage, Stage 12 at the Universal Studios Lot, was used for the Penguin's Arctic-exhibit lair. Some sets were kept very cold for the live emperor, black-footed, and king penguins. The birds were flown in on a refrigerated airplane for filming, and had a chilled waiting area containing a swimming pool stocked with half a ton of ice daily and fresh fish. DeVito said that he generally liked being on set but disliked the cold conditions, though he was the only person somewhat comfortable because of his costume's heavy padding. To create the penguin army, the live penguins were supplemented with puppets, forty emperor-penguin suits worn by little people, and computer-generated imagery (CGI). People for the Ethical Treatment of Animals (PETA) protested the use of real penguins, objecting to the birds being moved from their natural environment. 
Although the organization had reportedly said that the penguins were not mistreated during filming, it later complained that the birds did not get fresh drinking water, just a small, chlorinated pool. PETA also objected to the penguins being fitted with appliances representing weapons and gadgets, which Warner Bros. said were lightweight plastic. Burton said that he did not like using real animals because he had an affinity for them, and ensured that the penguins were treated with care. Walken described the filming as very collaborative, recalling that his suggestion to add a blueprint for Shreck's power plant resulted in a model being built within a few hours. The scene of Catwoman putting a live bird in her mouth was performed live, with no CGI enhancements. Pfeiffer said that, in retrospect, she would not have done the stunt, as she had not considered the risks of injury or disease involved. For a scene in the sewers, monkey handlers positioned above and below managed the organ grinder's monkey as it descended a set of stairs with a note for the Penguin. When it saw DeVito in full costume and makeup, it leapt at his testicles. DeVito said, "The monkey looked at me, froze, and then leapt right at my balls...Thank god it was a padded costume." A scene of Shreck's superstore exploding caused minor injuries to four stuntmen. Principal photography ended on February 20, 1992, after 170 days. Post-production. Chris Lebenzon edited the film's 126-minute theatrical cut. The final scene of Catwoman looking up at the Bat-Signal was filmed during post-production, only two weeks before the film's release. Warner Bros. mandated the scene (showing that the character survived) after test audiences responded positively to Pfeiffer's performance. Pfeiffer was unavailable to film the scene, and a stand-in was used. A scene of Penguin's gang destroying a store filled with Batman merchandise was removed. Warner Bros. provided a final budget for "Batman Returns" of $55 million, although it has been reported (or estimated) as $50, $65, $75, or $80 million. Music. Danny Elfman was initially reluctant to score "Batman Returns" because he was unhappy that his "Batman" score had been supplemented with pop music by Prince. Elfman built on many of his "Batman" themes, and said that he enjoyed working on the Penguin's themes the most because of the character's sympathetic aspects, such as his abandonment and death. Recorded with a studio orchestra on the Sony Scoring Stage in Los Angeles, Elfman's score includes vocals, harps, bells, xylophones, flutes, pianos, and chimes. The song "Face to Face", played during the costume-ball scene, was co-written and performed by the British rock band Siouxsie and the Banshees. Burton and Elfman fell out during production due to the stress of finishing "Batman Returns" on time, but reconciled shortly afterward. Design and effects. "Batman" production designer Anton Furst was replaced by Bo Welch, who understood Burton's visual intentions after previous collaborations on "Beetlejuice" (1988) and "Edward Scissorhands" (1990). Furst, already occupied on another project, committed suicide in November 1991. Warner Bros. maintained a high level of security for "Batman Returns", requiring the art department to keep their window blinds closed. Cast and crew had to wear ID badges with the film's working title, "Dictel", a word coined by Welch and Burton meaning "dictatorial"; they were unhappy with the studio's "ridiculous gestapo" measures. 
Welch designed the Batboat vehicle, a programmable batarang, and the Penguin's weaponized umbrellas. He added features to the Batmobile, such as the ability to shed much of its exterior to fit through tighter spaces; this stripped-down version was called the "Batmissile". Sets. The sets were redesigned in Welch's style, including the Batcave and Wayne Manor. They were spread across seven soundstages on the Warner Bros. lot and the largest stage owned by Universal Pictures. "Batman Returns" was filmed on sets, although some panoramic shots (such as the camera traveling from the base of Shreck's department store to its cat-head-shaped office) were created with detailed miniatures. Welch found it difficult to create something new without deviating from Furst's award-winning work. The designs were intended to appear as a separate district of Gotham; if "Batman" took place on the East Side, "Batman Returns" was set on the West Side. Welch was influenced by German Expressionism, neo-fascist architecture (including Nazi Germany-era styles), American Precisionist painters, and photos of the homeless living on the streets in affluent areas. He incorporated Burton's rough sketch of Catwoman, which had a "very S&M kind of look", by adding chains and steel elements which would appear to hold together a city on the verge of collapse. The key element for Welch came early in design, when he realized that he wanted to manipulate spaces to convey specific emotions (emphasizing vertical buildings to convey a "huge, overwhelmingly corrupt, decaying city" filled with small people): "The film is about this alienating, disparate world we live in." The wintertime setting took advantage of the contrast between black and white scene elements, influenced by "Citizen Kane" (1941) and "The Third Man" (1949). Welch's concept designs began by carving out building shapes from cardboard with images of fascist sculptures and Depression-era machine-age art. The resulting rough model represented Gotham Plaza, described as a futuristic, oppressive, and "demented caricature" of Rockefeller Center. It was deliberately overbuilt, emphasizing the generic-but-oppressive heart of Gotham's corruption. Despite complaints from the film's financiers about its necessity, Burton insisted on including a detailed church overshadowed by plain surroundings. Designs attempted to create the illusion of space; the Wayne Manor set was only partially built (consisting primarily of a large staircase and fireplace), with a scale which implied that the rest of the structure was massive. Penguin's base was initially scheduled to be built in a standard-height Warner Bros. soundstage, but Welch thought that it lacked "majesty" and did not create enough contrast between itself and the "evil, filthy, little bug of a man". A taller Universal stage was acquired for the production, its raised ceilings making it seem more realistic and less like a set. Minor modifications were made to the set throughout the film to make it appear to be gradually deteriorating. The location featured a water tank surrounding a faux-ice island. Selina Kyle's apartment had a large steel beam running through its center to appear as if it had been built around a steel girder, which Welch said made it depressing and ironic. The wood used to build the sets was donated to Habitat for Humanity to help build low-cost homes for the poor. Costumes and makeup. Bob Ringwood and Mary E. Vogt were the costume designers. 
They refined the Batsuit to create the illusion of mechanical parts built into the torso, intending Batman to resemble Darth Vader. Forty-eight foam-rubber Batsuits were made for "Batman Returns". They had a mechanical system of bolts and spikes beneath the breastplate to secure the cowl and cape because "otherwise, if [Keaton] turned around quickly the cape would stay where it was", due to its weight. Costumer Paul Barrett-Brown said that the suit had a "generous codpiece" for comfort, and initially included a zippered fly to allow Keaton to use the bathroom; the actor declined it, however, because it could be seen by the camera from some angles. As with the "Batman" costume, Keaton could not turn his head; he compensated by making bolder, more powerful movements with his lower body. The Catwoman outfit was made from latex because it was designed to be "black and sexy and tight and shiny". The material was chosen because of its association with "erotic and sexual" situations, reflecting the character's transition from a repressed secretary to an extroverted, erotic female. Padding was added because Pfeiffer was less physically endowed than Bening; this worked to Pfeiffer's advantage, however, since Barrett-Brown said that if the suit was too tight it "would reveal the genital area so thoroughly that you'd get an X certificate." Ringwood and Vogt thought that if the latex material tore it would not be difficult to repair; even so, forty to seventy backup Catwoman suits were made by Western Costume, the Warner Bros. costume department, and Los Angeles-based clothing manufacturer Syren, at a cost of $1,000 each. Other versions, made for Pfeiffer from a cast of her body, were so tight that she had to be covered in baby powder to wear them. Barrett-Brown said that, because of the material, it was only possible to get into the suit when dry; the suits could not be reused, however, because of sweat and body oils. Vin Burnham constructed Catwoman's headpiece and mask. Burton, influenced by calico cats, wanted the suit to have visible stitching, but the stitching came apart, and Ringwood and Vogt struggled with attaching stitching to latex. They tried to sculpt stitching and glue it on, but did not like the look, and instead went over the suit with liquid silicone while it was worn (which added a shine to everything). Pfeiffer said that the suit was like a second skin, but uncomfortable when worn for long periods; there was no way to use the restroom, and it would stick to her skin, occasionally causing a rash. She found the mask similarly confining, describing it as choking her or "smashing my face", and she would catch the claws on nearby objects. Stan Winston Studio created an "over-the-top Burtonesque" visual for the Penguin, without obscuring DeVito's face. Concept artist Mark McCreery drew a number of sketches for the look, from which the studio built noses on a lifecast of DeVito's face. Winston was unhappy with the "pointy nose" shapes and began sculpting ideas with clay, influenced by his work on "The Wiz" (1978) (which involved a forehead and brow prosthetic appliance for large-beaked creatures). The final makeup included a T-shaped appliance which went over DeVito's nose, lip and brow, as well as crooked teeth, whitened skin and dark circles under his eyes. Ve Neill applied the makeup, made by John Rosengrant and Shane Mahan. The several pounds of facial prosthetics, body padding, and prosthetic hands took four and a half hours to apply to DeVito, though this was reduced to three hours by the end of filming. 
An air bladder was added to the costume to help reduce its weight. DeVito helped create the Penguin's black saliva with the makeup and effects teams, using a mild mouthwash and food coloring which he squirted into his mouth before filming, and said its taste was acceptable. Burton described DeVito as completely in-character in costume, saying he "scared everybody". While re-dubbing some of his dialogue, DeVito struggled to get into character without the makeup and had it applied to improve his performance. Because of the secrecy surrounding his character's appearance before marketing, DeVito was not allowed to discuss it with others (including his family). A photo leaked to the press, and Warner Bros. employed a firm of private investigators in a failed attempt to track down the source. Penguins. Stan Winston Studio provided animatronic penguins and costumes to supplement Penguin's army. Thirty animatronic versions were made: ten each of black-footed, king, and emperor penguins. Costumes worn by little people were slightly larger than the animatronics; the actors controlled walking, the mechanized heads were remote-controlled, and the wings were puppeteered. Dyed black chicken feathers were used for the penguin bodies. McCreery's designs for the penguin army initially included a flamethrower, which was replaced with a rocket launcher. Mechanical-effects designers Richard Landon and Craig Caton-Largent supervised the manufacture of the animatronics, which required nearly 200 different mechanical parts to control the head, neck, eyes, beak, and wings. Boss Film Studios produced the CGI penguins. Release. Context. By the theatrical summer of 1992 (beginning the last week of May), the film industry was struggling with low ticket sales, rising production costs, and several box-office failures the previous year. Eighty-nine films were scheduled for release during the season, including "A League of Their Own", "Alien 3", "Encino Man", "Far and Away", "Patriot Games", and "Sister Act". Studios had to carefully schedule their releases to avoid competition from anticipated blockbusters, such as "Lethal Weapon 3" and "Batman Returns", as well as the 1992 Summer Olympics. "Batman Returns" was predicted to be the summer's biggest success, and other studios were reportedly concerned about releasing their films within even a few weeks of its premiere. Paramount Pictures increased the budget of "Patriot Games" by $14million to make it more competitive with "Batman Returns" and "Lethal Weapon 3". Marketing. Franchising had not been considered an important aspect of "Batman" prior to its release. However, after merchandise contributed about $500million to its $1.5billion total earnings, it was prioritized for "Batman Returns". Warner Bros. delayed major promotion until February 1992 to avoid over-saturation and the risk of driving away audiences. A 12-minute promotional reel debuted at WorldCon in September 1991, alongside a black-and-white poster of a silhouetted Batman, which was called "mundane" and uninspiring. A trailer was released in 5,000 theaters in February 1992 with a new poster of a snow-swept Batman logo. The campaign focused on the three central characters (Batman, Penguin, and Catwoman), which Warner Bros. believed would offset the loss of the popular Nicholson. Over two-thirds of the 300 posters Warner Bros. installed in public places were stolen. Warner Bros. eventually offered 200 limited-edition posters for $250, signed by Keaton, who donated his earnings to charity.
Over $100million was expected to be spent on marketing, including $20million by Warner Bros. for commercials and trailers, and $60million by merchandising partners. The partners, which included McDonald's, Ralston Purina, Kmart, Target Corporation, Venture Stores, and Sears, planned to host about 300 in-store Batman shops. McDonald's converted 9,000 outlets into Gotham City restaurants, offering Batman-themed packaging and a cup lid which doubled as a flying disc. CBS aired a television special, "The Bat, The Cat, The Penguin... Batman Returns", and Choice Hotels sponsored the hour-long "The Making of Batman Returns". Television advertisements featured Batman and Catwoman fighting over a can of Diet Coke, and the Penguin (and his penguins) promoted Choice Hotels. Advertisements also appeared on billboards and in print (three consecutive pages in some newspapers), targeted at older audiences. Box office. "Batman Returns" premiered on June 16, 1992, at Grauman's Chinese Theatre in Hollywood. Two blocks of Hollywood Boulevard were closed for over 3,000 fans, 33 TV film crews, and 100 photographers. A party was held afterwards on the Stage 16 Gotham Plaza set for guests who included Keaton, Pfeiffer, DeVito, Burton, DiNovi, Arnold Schwarzenegger, Faye Dunaway, James Caan, Mickey Rooney, Harvey Keitel, Christian Slater, James Woods, and Reubens. The film had a limited preview release in the U.S. and Canada on Thursday, June 18, earning $2million. It had a wide release the following day, and was shown on an above-average 3,000 screens across 2,644 theaters. "Batman Returns" earned $45.7million during its opening weekend (an average of $17,279 per theater), and was the number-one film—ahead of "Sister Act" in its fourth weekend ($7.8million) and "Patriot Games" in its third ($7.7million). This figure broke the record for the highest-grossing opening weekend, set by "Batman" ($42.7million). The film held this record until the release of "Jurassic Park" ($50.1million) the next year. Initial performance analysis suggested that "Batman Returns" could become one of the all-time highest-grossing films; Warner Bros. executive Robert Friedman said, "We opened it the first real weekend when kids are out of school. The audience is everybody, but the engine that drives the charge are kids under 20." According to "Patriot Games" producer Mace Neufeld, other films benefited from overflow audiences for "Batman Returns" who did not want to wait in long lines or were turned away from sold-out screenings. "Batman Returns" earned $25.4million in its second weekend (a 44.3-percent drop) and was the number-one film again, ahead of the premiering "Unlawful Entry" ($10.1million) and "Sister Act" ($7.2million). By the film's third weekend, it was the second-fastest film to gross $100million (11 days), behind "Batman" (10 days). It remained the number-one film with a gross of $13.8million (a 45.6-percent drop), ahead of the premiering "A League of Their Own" ($13.7million) and "Boomerang" ($13.6million). "The Washington Post" called its week-over-week drops troublesome, and industry analysis suggested that "Batman Returns" would not replicate the longevity of "Batman"s theatrical run. "Batman Returns" never regained the number-one position after falling to number four over its fourth weekend, and left the top-ten highest-grossing films by its seventh. The film left theaters in late October after 18 weeks, with a total gross of $162.8million.
It became the third-highest-grossing film of 1992, behind "Home Alone 2: Lost in New York" ($173.6million) and "Aladdin" ($217.3million). "Batman Returns" earned an estimated $104million outside the U.S. and Canada, including a record-setting £2.8million opening weekend in the United Kingdom. This broke the previous record, set in 1991, and made "Batman Returns" the first film to gross more than £1million there in a single day. Worldwide, "Batman Returns" grossed $266.8million, making it the sixth-highest-grossing film of 1992, ahead of "A Few Good Men" ($243.2million) and behind "Lethal Weapon 3" ($321.7million). Reception. Critical response. "Batman Returns" had a polarized reception from professional critics. Audiences polled by CinemaScore gave the film an average grade of B on an A+-to-F scale. Several reviewers compared "Batman Returns" and "Batman"; some suggested that the sequel had faster pacing and more comedy and depth, avoiding "Batman"s "dourness" and "tedium". Critics generally agreed that Burton's creative control made "Batman Returns" a more personal work than "Batman", something "fearlessly" different which could be judged on its own merits. Critics such as Kenneth Turan, however, said that Burton's innovative, impressive visuals made "Batman Returns" feel cheerless, claustrophobic and unexciting, and were often emphasized at the expense of the plot. According to Owen Gleiberman, Burton's fantastic elements were undermined because he did not establish a base of normality. The plot had a mixed response. Some reviewers praised the first and second acts and interesting characters who could evoke audience emotion. Others said that it lacked suspense, thrills, or clever writing, overwhelmed by too many characters and near-constant banter. The ending was criticized for lackluster action and for failing to bring the separate character threads to a satisfactory conclusion. According to Janet Maslin, Burton cared mainly about visuals and plot was a secondary consideration. Gene Siskel said that the emphasis on characterization was detrimental; the sympathetic villains left him hoping that Batman would not win and that each character would find emotional peace. Reviewers generally agreed that despite Keaton's abilities, his character was ignored by the script in favor of the villains; scenes without him were among the best. Todd McCarthy described Batman as a symbol of good rather than a psychologically complete character, and Ebert wrote that "Batman Returns" depicts being Batman as a curse instead of a heroic power fantasy. Peter Travers, however, said that Keaton's "manic depressive hero" was a deep, realized character in spite of the film's faster pace. DeVito was praised for his energy, unique characterization, and ability to convey his character's tragedy despite the costumes and prosthetics. Desson Howe said that Burton's focus on the Penguin indicated his sympathy for the character. Some reviewers considered DeVito an inferior follow-up to Nicholson's Joker, who evoked sympathy without instilling fear. Pfeiffer's performance received near-unanimous praise as the film's standout: a passionate, sexy, ambitious, intelligent, intimidating, and fierce embodiment of feminism who offered the only respite from the otherwise-dark tone. Jonathan Rosenbaum, however, said that she did not live up to Nicholson's villain. Turan called the scenes shared by Batman and Catwoman the film's most interesting, and Travers said that when they take off their masks at the end they look "lost and touchingly human".
Ty Burr described the ballroom scene (in which they realize each other's secret identities) as more emotional than anything in "Batman". Ebert noted that their sexual tension seemed to have been undercut for a younger audience. Walken's performance was described as "wonderfully debonair", funny and engaging, a villain who could have carried "Batman Returns" alone. Welch's production design was generally praised, offering a sleeker, brighter, more authoritarian visual style than Furst's "brooding", oppressive aesthetic. McCarthy described Welch's ability to realize Burton's imaginative universe as an achievement, although Gene Siskel described Welch as a "toy shop window decorator" compared to Furst. The costumes and makeup effects were also praised, with Maslin saying that those images would linger in the imagination long after the narrative was forgotten. Czapsky's cinematography was well-received, even giving a "lively" aesthetic to the subterranean sets. The film's violent, mature, and sexual content, such as kidnappings and implied child murder, was criticized as inappropriate for younger audiences. Accolades. At the 46th British Academy Film Awards, "Batman Returns" was nominated for Best Makeup (Ve Neill and Stan Winston) and Best Special Visual Effects (Michael Fink, Craig Barron, John Bruno, and Dennis Skotak). For the 65th Academy Awards, "Batman Returns" received two nominations: Best Makeup (Neill, Ronnie Specter, and Winston) and Best Visual Effects (Fink, Barron, Bruno, and Skotak), but lost both awards to "Bram Stoker's Dracula" and "Death Becomes Her", respectively. Neill and Winston received the Best Make-up award at the 19th Saturn Awards. The film received four other Saturn Award nominations for Best Fantasy Film, Best Supporting Actor (DeVito), Best Director (Burton), and Best Costume Design (Bob Ringwood, Mary Vogt, and Vin Burnham). DeVito was nominated for Worst Supporting Actor at the 13th Golden Raspberry Awards, and Pfeiffer for Most Desirable Female at the 1993 MTV Movie Awards. "Batman Returns" was nominated for a Hugo Award for Best Dramatic Presentation. After release. Performance analysis and aftermath. The U.S. and Canadian box offices underperformed in 1992, with admissions down by up to five percent and about 290million tickets sold (compared to over 300million in each of the preceding four years). Industry professionals blamed the drop on the lack of quality of the films being released, considering them too derivative or dull to attract audiences. Even films considered successful had significant box-office drops week over week from apparently-negative word of mouth. Industry executive Frank Price said that the releases were not attracting the younger audiences and children which were vital to a film's success. Rising ticket prices, competition from the Olympics, and an economic recession were also considered contributing factors to the declining figures. "Batman Returns" and "Lethal Weapon 3" contributed to Warner Bros.' best first half-year in its history, and were expected to return over $200million to the studio from the box office. "Batman Returns" was considered a disappointment as a sequel to the fifth-highest-grossing film ever made, however, and fell about $144.8million short of "Batman"s $411.6million theatrical gross. By July 1992, anonymous Warner Bros. executives reportedly said about the film, "It's too dark. It's not a lot of fun."
Despite its PG-13 rating from the Motion Picture Association, which warns parents that a film may contain strong content unsuitable for children, Warner Bros. received thousands of complaint letters from audiences, particularly parents, who disliked "Batman Returns"s violent and sexualized content. Waters recalled the aftermath of one screening: "It's like kids crying, people acting like they've been punched in the stomach and like they've been mugged." He had anticipated, and enjoyed, some backlash, but acknowledged he may have made some mistakes. McDonald's was criticized for its child-centered promotion and toys, and discontinued its "Batman Returns" campaign in September 1992. Burton said that he preferred "Batman Returns" to "Batman", and thought it was less dark than its predecessor, despite the backlash. Although much of Hamm's work was replaced, he defended Burton and Waters, saying that aside from the merchandise, "Batman Returns" was never presented as child-friendly. Warner Bros. decided to continue the series without Burton (described as "too dark and odd for them"), replacing him with Joel Schumacher. A rival studio executive said, "If you bring back Burton and Keaton, you're stuck with their vision. You can't expect "Honey, I Shrunk the Batman"" (referring to the 1989 science-fiction comedy, "Honey, I Shrunk the Kids"). Warner Bros. was sued by executive producers Benjamin Melniker and Michael Uslan, who alleged that they had originally purchased the film-adaptation rights to the Batman character but were denied their share of the profits from "Batman" and "Batman Returns" by the studio's Hollywood accounting: a method used by studios to artificially inflate a film's production costs, making it appear unprofitable and limiting royalty (or tax) payments. The court decided in the studio's favor, citing a lack of evidence. Home media. "Batman Returns" was released on VHS and LaserDisc on October 21, 1992. Its VHS version had a lower-than-average price to encourage sales and rentals. The film was expected to sell millions of copies and be a well-performing rental, but its success would be restricted by its content, which would appeal less to children (the main audience driving purchases). Elfman's score was released in 1992 on compact disc (CD), and an expanded soundtrack was released in 2010. "Batman Returns" was released on DVD in 1997, with no additional features. An anthology DVD box set was released in October 2005, with all the films in the Burton-Schumacher Batman film series. The "Batman Returns" segment had commentary by Burton, the "The Bat, The Cat, The Penguin" special about the making of the film, part four of the documentary "Shadows of the Bat: The Cinematic Saga of the Dark Knight", notes on the development of costumes, make-up and special effects, and the music video for "Face to Face". The same anthology was released on Blu-ray in 2009 alongside a standalone "Batman Returns" Blu-ray release. A 4K Ultra HD Blu-ray version was released in 2019; restored from the original 35mm negative, it included the anthology's special features. A 4K collector's edition was released in 2022 with a SteelBook case (with original cover art), character cards, a double-sided poster, and previously released special features. Other media.
About 120 products were marketed with "Batman Returns", including action figures and toys by Kenner Products, Catwoman-themed clothing, toothbrushes, roller skates, T-shirts, underwear, sunglasses, towels, beanbags, mugs, weightlifting gloves, throw pillows, cookie cutters, commemorative coins, playing cards, costume jewelry, cereal, a radio-controlled Batmobile, and tortilla chips shaped like the Batman logo. Although there were about the same number of products marketed for "Batman", there were fewer licensees, allowing Warner Bros. more oversight. The release of "Batman: The Animated Series" later in 1992 was anticipated to extend merchandising success long after "Batman Returns" had left theaters. Warner Bros. used holographic labels developed by American Bank Note Holographics to detect counterfeit products. The film's novelization, by Craig Shaw Gardner, was published in July 1992. A roller coaster (Batman: The Ride) was built at Six Flags Great America at a cost of $8million, and was later replicated at other Six Flags parks with a Batman stunt show. Several video-game adaptations titled "Batman Returns" were released by a number of developers on almost all available platforms; the Super Nintendo Entertainment System version was the most successful. To celebrate the 80th anniversary of the Penguin's first comic-book appearance, DeVito wrote "Bird Cat Love", a 2021 comic-book story about the Penguin and Catwoman falling in love and ending the COVID-19 pandemic. "Batman '89", a comic-book series first released in 2022, continues the narrative of Burton's original two films and ignores the Schumacher sequels. Set a few years after the events of "Batman Returns", "Batman '89" depicts the transformation of district attorney Harvey Dent into Two-Face and introduces Robin. The series was written by Hamm, with art by Joe Quinones. The Red Triangle Gang made their first appearance outside "Batman Returns" in the 2022 comic book "Robin" #15. A holiday book was released in 2022, "Batman Returns: One Dark Christmas Eve: The Illustrated Holiday Classic", by Ivan Cohen. Thematic analysis. Duality. Critic David Crow called duality a major aspect of "Batman Returns", with Catwoman, Penguin, and Shreck representing warped, reflected aspects of Batman. Like Wayne (Batman), Selina (Catwoman) is driven by trauma and conflicted about her principles and desires; unlike Batman (who seeks justice), however, she seeks vengeance. Although Catwoman agrees with Batman's appeal that they are "the same, split right down the center", they still differ too much to be together. Critics Darren Mooney and Betsy Sharkey suggested that Penguin reflects Batman's origin; each lost their parents at an early age. Shreck says that if not for his abandonment, Cobblepot and Wayne might have traveled in the same social circles. Batman is content in his loneliness, however; the Penguin wants acceptance, love and respect, despite his quest for revenge. To Mooney, "Batman Returns" hints that Batman's issues with Penguin are personal rather than moral; Batman is quietly proud of being a "freak" (unique), and resents Penguin for displaying his "freakishness". Shreck represents Wayne's public persona if it were driven by greed, vanity, and self-interest: a populist industrialist who wins favor with cheap presents tossed into a crowd. Commercialism and loneliness.
Crow saw "Batman Returns" as a denouncement of Batman's real-world cultural popularity and merchandising (especially in the wake of the previous film), and noted that a scene of a store filled with Batman merchandise being destroyed was removed from the film. Crow and Mooney wrote that "Batman Returns" is "saturated with Christmas energy"; it rejects the season's conventional norms and becomes an anti-Christmas film, however, critiquing its over-commercialism and lack of true goodwill. Shreck cynically exploits Christmas tropes for his own ends (falsely portraying himself as selfless and benevolent), and the perversions of Penguin's Red Triangle gang are a more overt rejection of the holiday. The film focuses on loneliness and isolation during Christmastime; Wayne is introduced sitting alone in his vast mansion, inert until the Bat-Signal shines in the sky. He makes a connection with Kyle, but what they share cannot overcome their differences and he ends the film as he began italone. Critic Todd McCarthy identified isolation as a theme common to much of Burton's work, which is emphasized in the three main characters. Rebecca Roiphe and Daniel Cooper wrote that "Batman Returns" was not antisemitic, but had antisemitic imagery. The Penguin, they believed, embodied Jewish stereotypes such as "... his hooked nose, pale face and lust for herring" and was "unathletic and seemingly unthreatening but who, in fact, wants to murder every firstborn child of the gentile community." The character joins forces with Shreck (who has a Jewish-sounding name) to disrupt and taint Christmas and Christian traditions. Sexuality and misogyny. "Batman Returns" has overtly sexual elements. Critic Tom Breihan described Catwoman's vinyl catsuit as "pure BDSM", including the whip she wields as a weapon. The dialogue is replete with double entendres, particularly by Penguin and Catwoman; in her fights with Batman, she sensuously licks his face. Selina / Catwoman is marginalized by the central male characters, however; Shreck pushes her out of a window, the Penguin tries to kill her when she spurns his advances, and Batman attempts to capture her. She fashions a catsuit to regain order, sanity, and power, but it is gradually damaged over the course of the film and her sanity decays with it. Catwoman's final choice is to reject Batman's offer of a happy ending by abandoning her revenge against Shreck; to surrender herself to Batman's will would allow another man to control her. Power and politics. Power is a central theme for several characters; Shreck says, "There's no such thing as too much power; if my life has a meaning that's the meaning." He uses his money to gain power, and Batman uses his fortune to fund his war against crime (unlike Penguin, who was abandoned because he did not fit the image expected by his wealthy parents). Kyle gains power by donning the Catwoman costume and embracing her anger and sexuality. Shreck convinces Penguin to run for mayor to further his own goals, and the Penguin seeks out the acceptance and respect it would give him. Critic Caryn James wrote that "Batman Returns" has "sharp political jabs" which implies that money and image are more important than anything else. In "Batman", the Joker buys citizen support by throwing them piles of money; in the sequel, Shreck and Penguin gain the support of the populace with spectacle, pandering, and corporate showmanship. 
The Penguin describes how he and Shreck are both seen as monsters, but Shreck is a "well-respected monster and I, to date, am not." James said that the Penguin wants to change the superficial perception of himself because he wants to be accepted, but has no interest in being lovable. Only when the fickle voters turn on him, however, does he resort to his plan to kill infants who had the chances he never had. Crow believed that Burton was the most sympathetic to Penguin, and spent the most time on the character. Legacy. Cultural influence. Retrospectives in the 2010s and 2020s noted that "Batman Returns" had developed an enduring legacy since its release, with "Comic Book Resources" describing it as the most iconic comic-book film ever made. Although initially criticized for its mix of the superhero and film noir genres, the film established trends toward dark tones and complex characters which have since become an expectation of many blockbusters. Some writers said that its "disturbing imagery", exploration of morality, and satire of corporate politics seemed even more relevant in the present day, as did the themes of prejudice and feminism explored in Catwoman. Burton said that he believed "Batman Returns" was exploring new territory at the time, but it might be considered "tame" by modern standards. According to "The Ringer", Burton's "weird and unsettling" sequel enabled future auteurs such as Christopher Nolan, Peter Jackson, and Sam Raimi to move into mainstream films. "Collider" described the film as the first "anti-blockbuster", defying expectations and delivering a superhero film with little action set during Christmas (despite its June release). The film's performances, score, and visual aesthetic are considered iconic, influencing Batman-related media and incarnations of the characters for decades (such as the "Batman: Arkham" video games). "The Batman" (2022) director Matt Reeves and Batman actor Robert Pattinson called "Batman Returns" their favorite "Batman" film, with Reeves ranking it alongside "The Dark Knight" (2008), and director Robert Eggers said that it visually inspired his film "Nosferatu" (2024). Pfeiffer's Catwoman is considered iconic, a feat of characterization and performance which influenced subsequent female-superhero-led films. Her performance is generally regarded as the best cinematic adaptation of the character (influencing future portrayals such as Zoë Kravitz's in "The Batman"), one of the best comic book film characters, and among the greatest cinematic villains. In 2022, "Variety" ranked Pfeiffer's Catwoman as the second-best superhero performance of the preceding fifty years, behind Heath Ledger's Joker. DeVito's performance as the Penguin is also considered iconic, and has been listed by some publications as one of the best cinematic Batman villains. Modern reception. In the years since its release, "Batman Returns" has been positively reappraised. It is now regarded as one of the best superhero films ever made, one of the best sequels, and one of the best "Batman" films. "Screen Rant" called it the best Batman film of the 20th century and, in 2018, "Total Film" named it the best Batman film. "Batman Returns" was number 401 on "Empire"s 2008 list of the 500 greatest movies of all time. Some publications have identified "Batman Returns" as part of Burton's unofficial Christmas trilogy, bookended by "Edward Scissorhands" and "The Nightmare Before Christmas", and it has become an alternative-holiday film along with films such as "Die Hard" (1988).
Some publications have also listed it as one of the best Christmas films. The film's writer Daniel Waters recalled being told that "Batman Returns" was a "great movie for people who don't like Batman". Although the film was criticized for depicting Batman killing people, Waters said, "To me, Batman not killing [the Joker (played by Heath Ledger)] at the end of "The Dark Knight" after proving he can get out of any prison, it's like 'Come on. Kill Heath Ledger.'" He believed that the reception to "Batman Returns" was improving with time, especially after the release of "The Batman" in 2022. Critic Brian Tallerico said that the elements which originally upset critics and audiences are what make it still "revelatory... It's one of the best and strangest movies of its kind ever made." On the review aggregator Rotten Tomatoes, the film holds a positive approval rating from critics. According to the website's critical consensus, "Director Tim Burton's dark, brooding atmosphere, Michael Keaton's work as the tormented hero, and the flawless casting of Danny DeVito as The Penguin and Christopher Walken as, well, Christopher Walken make the sequel better than the first." The film has a score of 68 out of 100 on Metacritic (based on 23 critics), indicating "generally favorable" reviews. Sequels. Following the reception of "Batman Returns", Warner Bros. intended to continue the series without Burton. Burton considered making a third film, but the studio encouraged him to make something else and he realized they did not want him to return for a sequel. The studio replaced Burton with Schumacher, who could make something more family- and merchandise-friendly. Although Burton and Keaton said they were supportive of the new director, Keaton also left the series because "[the film] just wasn't any good, man." Industry press suggested that Keaton had also asked for a $15million salary and a percentage of the profits, although his producing partner Harry Colomby said that money was not the issue. Burton was an executive producer for the third film, "Batman Forever" (1995), which had a more mixed reception than "Batman Returns" but was a financial success. The fourth and final film, "Batman & Robin" (1997), was a financial and critical failure and is regarded as one of the worst blockbuster films ever made. It stalled the Batman film series for eight years until the reboot, "Batman Begins" (2005). By the mid-1990s, Burton and Waters were signed to make a Catwoman-centered film starring Pfeiffer. Waters's plot depicted Catwoman as an amnesiac who, after her injuries at the end of "Batman Returns", ends up in the Las Vegas-like Oasisburg and confronts publicly virtuous male superheroes who are secretly corrupt. Burton and Pfeiffer took on other projects in the interim, and lost interest in the film. Warner Bros. eventually developed "Catwoman" (2004), starring Halle Berry, which was critically panned and is considered one of the worst comic-book films ever made. Keaton was scheduled to reprise his version of Batman in "Batgirl", a proposed 2022 film that was filmed but cancelled by Warner Bros. parent company, Warner Bros. Discovery. He appeared as Batman in "The Flash" (2023).
4729
9084172
https://en.wikipedia.org/wiki?curid=4729
Batman & Robin (film)
Batman & Robin is a 1997 American superhero film based on the DC Comics characters Batman and Robin by Bill Finger and Bob Kane. It is the fourth and final installment of Warner Bros.' initial "Batman" film series and a sequel to "Batman Forever" (1995). Directed by Joel Schumacher and written by Akiva Goldsman, it stars George Clooney as Bruce Wayne / Batman and Chris O'Donnell as Dick Grayson / Robin, alongside Arnold Schwarzenegger, Uma Thurman, and Alicia Silverstone. The film follows the eponymous characters as they attempt to prevent Mr. Freeze and Poison Ivy from taking over the world, while struggling to keep their partnership together. Warner Bros. fast-tracked development for "Batman & Robin" following the box office success of "Batman Forever". Schumacher and Goldsman conceived the storyline during pre-production on "A Time to Kill"; Schumacher was given a mandate to make the film more toyetic than its predecessor. After Val Kilmer decided not to reprise the role of Batman, Schumacher was interested in casting William Baldwin before George Clooney won the role. Principal photography began in September 1996 and wrapped in January 1997, two weeks ahead of the shooting schedule. "Batman & Robin" is the only film in the "Batman" film series made without the involvement of Tim Burton in any capacity. "Batman & Robin" premiered in Los Angeles on June 12, 1997, and went into general release on June 20. It grossed $238 million worldwide against a production budget of $125–160 million, and was considered a box office disappointment at the time. The film received generally negative reviews from critics and is considered to be one of the worst films ever made. The film's poor reception caused Warner Bros. to cancel future "Batman" films, including Schumacher's planned "Batman Unchained". One of the songs recorded for the film, "The End Is the Beginning Is the End" by the Smashing Pumpkins, won a Grammy Award for Best Hard Rock Performance at the 40th Annual Grammy Awards. Plot. Batman and his partner, Robin, encounter a new villain, Mr. Freeze, who has left a string of diamond thefts in his wake. During a confrontation at the natural history museum, Freeze steals a large diamond and flees, freezing Robin and leaving Batman unable to pursue him. Later, Batman and Robin learn that Freeze was originally Doctor Victor Fries, a scientist working to develop a cure for a disease known as MacGregor's syndrome, hoping to heal his terminally ill wife, Nora. After a lab accident, Fries was rendered unable to survive at normal temperatures and forced to wear a cryogenic suit powered by diamonds. At a Wayne Enterprises lab in Brazil, botanist Doctor Pamela Isley is working under the deranged Doctor Jason Woodrue, who has turned her research on plants into the supersoldier drug Venom. After witnessing Woodrue use the formula to turn serial killer Antonio Diego into the hulking Bane, she threatens to expose Woodrue's experiments. Woodrue attempts to kill her by overturning a shelf of various toxins; instead, Isley is mutated by the toxins into Poison Ivy, who kills Woodrue with a poisonous kiss, destroys the lab and escapes to Gotham City with Bane, concocting a plot to use Wayne's money to support her research. Meanwhile, Alfred Pennyworth's niece, Barbara Wilson, makes a surprise visit and is invited by Bruce to stay at Wayne Manor until she goes back to school. Wayne Enterprises presents a new telescope for Gotham Observatory at a press conference interrupted by Ivy.
She proposes a project that could help the environment, but Bruce declines her offer, knowing it could result in genocide. Batman and Robin decide to lure Freeze out using the Wayne family diamonds and present them at a Wayne Enterprises charity event. Ivy attends the event and decides to use her abilities to seduce Batman and Robin. Freeze crashes the party but is defeated and incarcerated at Arkham Asylum. Ivy takes an interest in Freeze and helps him escape. Dick discovers that Barbara has been participating in drag races to raise money for Alfred, who is dying of MacGregor's syndrome; Alfred has kept his illness from Bruce and Dick, but Barbara is secretly aware of his condition and is trying to find treatment for him. Batman, Robin, and the police arrive at Freeze's lair in response to his escape, discovering Nora preserved in a cryogenic chamber and that Freeze has developed a cure for the early stages of MacGregor's syndrome. The villains soon secretly arrive to recover Freeze's diamonds and Nora. Wanting Freeze for herself, Ivy cuts off the power to Nora's chamber, steals the diamonds, and seduces Robin, escalating tensions between him and Batman. At her hideout, Ivy convinces Freeze that Batman was responsible for the attempt on Nora's life; Freeze resolves to make humanity suffer in revenge, while Ivy plots to repopulate Earth with her mutant plants afterward. Freeze and Bane commandeer Gotham Observatory and convert the new telescope into a giant freeze ray, while Ivy uses the Bat-Signal to contact Robin. Robin attempts to go after Ivy alone, but Batman convinces him not to fall for Ivy's seduction. Barbara discovers the Batcave, where an artificial intelligence version of Alfred reveals he has made a suit for Barbara. Barbara dons the suit and becomes Batgirl, arriving at Ivy's lair in time to help Batman and Robin subdue her. Freeze begins to freeze Gotham over while Batman, Robin, and Batgirl head to Gotham Observatory together to stop him. Batman defeats Freeze in combat, while Batgirl and Robin incapacitate Bane and thaw the city. Freeze accuses Batman of taking Nora's life, only to be shown a recording of Ivy admitting to the crime. Batman reveals that Nora survived and offers Freeze the chance to continue his research on MacGregor's syndrome in exchange for his cure. Freeze accepts and returns to Arkham, where he is imprisoned in the same cell as Ivy, upon whom he plans to take revenge. Alfred is cured, and Bruce and Dick agree to let Barbara join them in fighting crime. Cast. John Glover portrays Dr. Jason Woodrue, a deranged scientist with a desire for world domination via his Venom-powered "supersoldiers", one of whom, Bane (portrayed by Robert Swenson), becomes Poison Ivy's bodyguard and muscle. Michael Reid MacKay plays Bane before he is injected with Venom. Vivica A. Fox and Vendela Kirsebom play Mr. Freeze's assistant and Nora Fries, Freeze's cryogenically frozen wife, respectively. Elizabeth Sanders appears as Gossip Gerty, Gotham's top gossip columnist. Michael Paul Chan and Kimberly Scott both appear as telescope scientists. Jesse Ventura and Ralf Moeller appear as Arkham Asylum guards. Coolio makes a cameo appearance, later stating that he was to play Scarecrow in the ultimately cancelled sequel "Batman Unchained". Production. Development. With the box office success of "Batman Forever" in June 1995, Warner Bros. immediately commissioned a sequel.
They hired director Joel Schumacher and writer Akiva Goldsman to reprise their duties the following August and decided it was best to fast-track production for a June 1997 target release date, breaking from the usual three-year gap between films. Schumacher wanted to pay homage to the work of the classic "Batman" comic books of his childhood. The storyline of "Batman & Robin" was conceived by Schumacher and Goldsman during pre-production on "A Time to Kill". Portions of Mr. Freeze's backstory were based on the "Batman: The Animated Series" episode "Heart of Ice", written by Paul Dini. Goldsman, however, expressed concerns about the script during pre-production discussions with Schumacher. Schumacher stated that he was given the mandate by the studio to make the film more toyetic, even when compared to "Batman Forever". The studio reportedly included toy companies in pre-production meetings; Mr. Freeze's blaster was specifically designed by toy manufacturers. Batman creator Bob Kane acted as an official consultant and was heavily involved in the production; he gave input on the film's script as well as on set. While Chris O'Donnell reprised the role of Robin, Val Kilmer decided not to reprise the role of Batman from "Batman Forever". Schumacher admitted that he had difficulty working with Kilmer on "Batman Forever". "He sort of quit," Schumacher said, "and we sort of fired him." Schumacher later said that Kilmer wanted to work on "The Island of Dr. Moreau" because Marlon Brando was cast in the film. Kilmer said that he was not aware of the fast-track production and was already committed to "The Saint" and "Heat". David Duchovny stated he was considered for the role of Batman, joking that he was not chosen because his nose was too big. George Clooney's casting as Batman was suggested by Warner Bros. executive Bob Daly. Schumacher originally had interest in casting William Baldwin in Kilmer's place, but chose Clooney after seeing his performance in "From Dusk till Dawn". Schumacher felt that Clooney "brought a real humanity and humor to the piece, an accessibility that I don't think anybody else has been able to offer" and that he strongly resembled the character from the comic books. Schumacher also believed that Clooney could provide a lighter interpretation of the character than Kilmer and Michael Keaton. As a consequence of time constraints, the costume department repurposed the costume worn by Val Kilmer in "Batman Forever" for the third act of the film. Ed Harris, Anthony Hopkins, and reportedly Patrick Stewart were considered for the role of Mr. Freeze, before the script was rewritten to accommodate Arnold Schwarzenegger's casting. Schumacher later denied that Stewart was ever considered. Schumacher decided that Mr. Freeze had to be "big and strong like he was chiseled out of a glacier". Mr. Freeze's armor was made by armorer Terry English, who estimated that the costume cost some $1.5 million to develop and make. To prepare for the role, Schwarzenegger wore a bald cap after declining to shave his head, held a blue LED in his mouth, and had acrylic paint applied. The blue LEDs had to be wrapped in balloons after battery acid started leaking into Schwarzenegger's mouth. His prosthetic makeup and wardrobe took six hours to apply each day. The extensive time spent on Schwarzenegger's costume significantly restricted his shooting time, as his contract was limited to 12 work hours a day. Schwarzenegger was paid a $25 million salary for the role.
Besides Uma Thurman, Demi Moore, Sharon Stone, and Julia Roberts were considered for the role of Poison Ivy. Schumacher first became aware of Thurman through her earlier role as Venus in "The Adventures of Baron Munchausen". Thurman ultimately took the role of Poison Ivy because she liked the femme fatale characterization of the character. Alicia Silverstone was the only choice for the role of Batgirl. Prior to filming, she was reported to have lost at least 10 pounds for the role. Silverstone would later recount the body shaming she encountered during promotion of the film. Filming and visual effects. Principal photography was set to commence in August 1996, but did not begin until September 12, 1996. "Batman & Robin" finished filming in late January 1997, two weeks ahead of the shooting schedule. The shooting schedule allowed Clooney to simultaneously work on the television series "ER" without any scheduling conflicts. O'Donnell said that despite spending much time with Schwarzenegger off set and during promotion for the film, they did not work a single day together during production; this was achieved by using stand-ins when one of the actors was unavailable. Stunt coordinator Alex Field taught Silverstone to ride a motorcycle so that she could play Batgirl. Filming was temporarily halted in the fall of 1996 when Mr. Freeze's blaster prop disappeared from the film set; a police investigation was subsequently opened, culminating in the raid of a film memorabilia collector's home. High public interest in the film caused security issues on set; according to producer Peter MacGregor-Scott, paparazzi regularly disrupted the set, and photographs of Schwarzenegger taken during filming sold for $10,000. Comparing the production with "Batman Forever", O'Donnell explained that "things felt much sharper and more focused, and it just felt like everything got a little softer on the second one. The first one, I felt like I was making a movie. The second one, I felt like I was making a toy commercial." He also complained about the Robin costume, saying that it was more involved and less comfortable than the one that he wore in "Batman Forever", with a glued-on mask that caused sweat to pool on his face. According to John Glover, who played Dr. Jason Woodrue, "Joel [Schumacher] would sit on a crane with a megaphone and yell before each take, 'Remember, everyone, this is a cartoon'. It was hard to act because that kind of set the tone for the film." Several different stunt doubles were used for the roles of Batman, Robin, and Mr. Freeze, some specializing in ice skating, aerial gymnastics, and driving. The film was shot at Warner Bros. Studios in Burbank, California. The grounds of Greystone Mansion were used for scenes taking place at Wayne Manor. Part of the film was also shot in Vienna, Austria, and in Montreal, Quebec, and Ottawa, Ontario, Canada. Production designer Barbara Ling stated that her influences for the design of Gotham City came from "neon-ridden Tokyo and the Machine Age. Gotham is like a World's fair on ecstasy." Although miniatures and computer-generated elements were used for some scenes, large full-scale sets were constructed, including Gotham City covered in ice. For scenes featuring people frozen by Mr. Freeze's ice-ray, life-sized mannequins covered in fake ice were created. Several different materials were tested for the faux ice before the crew settled on a fiber-resin combination. According to Ling, the ice effects alone took half a year to create.
Rhythm & Hues Studios (R&H) and Pacific Data Images created the visual effects sequences, with John Dykstra and Andrew Adamson credited as the visual effects supervisors. "Batman & Robin" featured 450 individual visual effects shots, 150 more than "Batman Forever". Motion capture was used to animate digital stunt doubles; for a scene featuring skysurfing, the department recorded the motion of a skyboarder in a wind tunnel at a military base in North Carolina. Music. Elliot Goldenthal returned to score "Batman & Robin" after collaborating with Schumacher on "Batman Forever". The soundtrack features a variety of genres by various bands and performers, showcasing alternative rock with the lead single "The End Is the Beginning Is the End" by the Smashing Pumpkins and the songs "Lazy Eye" by the Goo Goo Dolls and R.E.M.'s "Revolution". R&B singer R. Kelly wrote "Gotham City" for the soundtrack, which was featured in the end credits and was chosen as one of the singles, reaching the top 10 in the United States and the United Kingdom. Eric Benét and Meshell Ndegeocello also contributed R&B songs. Also included was the single "Look into My Eyes" by the hip-hop group Bone Thugs-n-Harmony, which reached the top 5. Other songs featured electronic dance elements, including those by Moloko and Arkarna. The soundtrack was released on May 27, 1997, two weeks and three days ahead of the film's premiere in the United States. The orchestral score for the film was never commercially released. Lisa Schwarzbaum of "Entertainment Weekly" gave the soundtrack a "C" and called it "as incoherent as the Batman films themselves". Retrospectively, Nicole Drum of ComicBook.com described the soundtrack as a "colorful sampling of popular music at the time that feels messy, complicated, and comforting all at the same time". Filmtracks.com deemed the orchestral score an improvement over that of its predecessor "Batman Forever", noting that, while borrowing several themes from the previous film, Goldenthal successfully "expands upon the statements of his title theme and action material so that they are fleshed out into more accessibly enjoyable music". Nevertheless, the website compared Goldenthal's work negatively to Danny Elfman's scores for "Batman" and "Batman Returns". In an interview with "IGN", composer Hans Zimmer, who contributed the score to Christopher Nolan's trilogy of "Batman" films, called Goldenthal's theme "the most glorious statement of Batman I'd ever heard". "The End Is the Beginning Is the End" by the Smashing Pumpkins won a Grammy Award for Best Hard Rock Performance at the 40th Annual Grammy Awards. Release. "Batman & Robin" had its premiere on June 12, 1997, in Westwood, Los Angeles. The film's United Kingdom premiere was then the country's "biggest and most expensive" movie premiere; the event was held at Battersea Power Station in London, with the building decorated to look like Gotham City and Wayne Manor. Expected to be among the tentpoles of the summer movie season, the film opened in the United States on June 20, 1997, in 2,934 theaters, where it remained for an average of approximately 6.2 weeks. The film was released on DVD four months later on October 22, 1997. A special edition DVD was released in 2005 that included a documentary series about the production of the film series, "Shadows of the Bat: The Cinematic Saga of the Dark Knight". Marketing. The theatrical trailer for "Batman & Robin" debuted on the February 19, 1997, episode of "Entertainment Tonight". Warner Bros.
spent $125 million to market and promote the film, in addition to its $160 million production budget. Several Six Flags amusement parks introduced new roller coasters themed to the film: "Batman & Robin: The Chiller" opened at Six Flags Great Adventure in 1997, and a Mr. Freeze-themed roller coaster opened at both Six Flags Over Texas and Six Flags St. Louis in 1998. Taco Bell launched a $20 million promotional campaign for the film, selling Batman-themed cups, collector toys, and figurines. Themed trading cards produced by Fleer and SkyBox International were also sold, some signed by Clooney, Schwarzenegger, Thurman, Silverstone, O'Donnell, and Schumacher. An eponymous tie-in video game developed by Probe Entertainment was released for the PlayStation on August 5, 1998, to mixed reviews. Reception. Box office. "Batman & Robin" was released on June 20, 1997, in the United States and Canada, grossing $42,872,605 in its opening weekend. That made it the third-highest opening weekend gross of 1997, behind "The Lost World: Jurassic Park" and "Men in Black", and the seventh-highest non-holiday opening weekend of all time as of its release. The film would hold the record for the highest opening weekend for an Arnold Schwarzenegger film until it was surpassed by "Terminator 3: Rise of the Machines" in 2003. Its opening weekend gross also remained George Clooney's highest until the release of "Gravity" in 2013. It reached the number one spot at the box office during its opening weekend, beating out films including "My Best Friend's Wedding". It would remain Schwarzenegger's last film to achieve this feat until "Collateral Damage" opened in 2002. "Batman & Robin" declined by 63% in its second week, which was credited to poor word of mouth and early competition with "Face/Off", "Hercules", and "Men in Black". In the UK, it had the second-highest opening ever, behind "Independence Day", with a gross of £4,940,566 ($8.2 million) for the weekend. The film went on to gross $107.4 million in the United States and Canada and $130.9 million internationally, coming to a worldwide total of $238.3 million. It grossed substantially less than the previous film in the series, and finished outside of the top ten films of 1997. With a production budget of $125–160 million, the film was considered to have under-performed at the box office, although it was estimated to have at least broken even. Schumacher criticized "prejudicial prerelease buzz" online and false news reports as a cause for the film's poor commercial performance. Warner Bros. acknowledged "Batman & Robin"s shortcomings in the domestic market but pointed out its success in other markets. In his book "Batman: The Complete History", Les Daniels analyzed the film's relatively strong performance outside of the United States, speculating that "nuances of languages or personality were likely to be lost in translation and admittedly eye-popping spectacle seemed sufficient." Critical response. Audiences polled by CinemaScore gave the film an average grade of "C+" on an A+ to F scale. Jay Boyar of the "Orlando Sentinel" believed "Batman & Robin" to be the least distinctive chapter in the series, calling it a "bat-smorgasbord of action, camp, pathos, spectacle and whatever" and blaming its blandness on the studio's increased involvement in its production. In his "thumbs down" review, Roger Ebert of the "Chicago Sun-Times" found the film to be "wonderful to look at" although it had "nothing authentic at its core", criticizing its toyetic approach.
Writing for the "Chicago Tribune", Gene Siskel, who gave positive reviews to the previous "Batman" films, also gave "Batman & Robin" a "thumbs down" rating, calling it a "sniggering, exhausting, overproduced extravaganza". While commending the film's visuals, Kenneth Turan of the "Los Angeles Times" called the film "indifferently acted" and "far too slick for even a toehold's worth of connection", believing that it "killed" the "Batman" film series. Desson Howe of "The Washington Post" disapproved of Schumacher's direction and Akiva Goldsman's script, calling it an "emptily flashy, meandering fashion show of a summer flick" and also believing that it should mark the end to the series. Andrew Johnston, writing in "Time Out", remarked, "It's hard to tell who "B&R" is intended for. Anyone who knows the character from the comics or the superb on Fox will be alienated. And though Schumacher treats the Adam West version as gospel, that show's campy humor is completely incompatible with these production values." James Berardinelli questioned the "random amount of rubber nipples and camera angle close-ups of the Dynamic Duo's butts and Bat-crotches". In his review for the "San Francisco Chronicle", Mick LaSalle said that the film failed to "convincingly inhabit the grandeur of its art direction and special effects", criticizing George Clooney as "the big zero of the film", who "should go down in history as the George Lazenby of the series". While deeming Clooney "the most ideal Batman to date" in a physical sense, Todd McCarthy of "Variety" found the character uninteresting and Clooney "unable to compensate onscreen for the lack of dimension on paper". Conversely, he described Thurman and Schwarzenegger's performances as the villainous duo as the "highlights of the film", pointing out Thurman's "comic wit conspicuously lacking elsewhere in the picture". Writing for "Star Tribune", Jeff Strickler criticized its "almost embarrassingly mundane" dialogue and called Schwarzenegger "wasted" in the role of Mr. Freeze and his character "drably written". Janet Maslin of "The New York Times" gave a more positive review and praised Thurman's performance as "perfect", comparing it to Mae West's "[mix of] true femininity with the winking womanliness of a drag queen", but criticizing Silverstone and Clooney's performances. Steven Rea of "The Philadelphia Inquirer" found Thurman at times "amusing" and similarly described her performance as "Mae West with moss". Legacy. "Batman & Robin" is considered to be one of the worst superhero films and among the worst films ever made. In 2009, Marvel Studios president Kevin Feige said that "Batman & Robin" may be the most important comic book film ever made in that it was "so bad that it demanded a new way of doing things" and created the opportunity to make "X-Men" (2000) and "Spider-Man" (2002) in a way that respected the source material to a higher degree. In an interview with "Vice" 20 years after its release, director Joel Schumacher apologized for the film while taking full responsibility for its poor reputation, stating, "I want to apologize to every fan that was disappointed because I think I owe them that. A lot of it was my choice. No one is responsible for my mistakes but me." He added, "I was scum. It was like I had murdered a baby", recounting his initial reaction to the overwhelmingly negative public response. Screenwriter Akiva Goldsman also apologized, saying, "we didn't mean for it to be bad. 
I swear, nobody was like, 'This will be bad.'", elaborating that the film was initially intended to be darker in tone. Retrospectively, George Clooney has spoken critically of and apologized for his involvement in the film, saying in 2005, "I think we might have killed the franchise", and calling it "a waste of money". In 2015, while promoting Disney's "Tomorrowland" at New York Comic Con, Clooney said that he had met former Batman actor Adam West and apologized to him for the film. Furthermore, when asked during a 2015 interview on "The Graham Norton Show" whether he had ever had to apologize for "Batman & Robin", Clooney responded, "I always apologize for "Batman & Robin"". In late 2020, he told Howard Stern that it was "physically" painful to watch his work in the role: "The truth of the matter is, I was bad in it. Akiva Goldsman — who's won the Oscar for writing since then — he wrote the screenplay. And it's a terrible screenplay, he'll tell you. I'm terrible in it, I'll tell you. Joel Schumacher, who just passed away, directed it, and he'd say, 'Yeah, it didn't work.' We all whiffed on that one." In 2021, Clooney said he refuses to let his wife watch the film. In April 2023, when asked about his biggest regret, Clooney responded, "I regret doing fucking Batman". Conversely, in an interview with "Empire" in 2012, Arnold Schwarzenegger stated that, despite its poor reception, he did not regret making the film, commenting about his role as Mr. Freeze and his involvement with the studio, "I felt that the character was interesting and two movies before that one Joel Schumacher was at his height. So the decision-making process was not off. At the same time I was doing "Eraser" over there and Warner Bros. begged me to do the movie." Similarly, 25 years after its theatrical release, Uma Thurman described her work on the film as a "fantastic experience". The nipples seen on the characters' costumes, first appearing in "Batman Forever" and accentuated for "Batman & Robin" at Schumacher's request, remain among the most defining aspects of the film. Recounting his involvement with the film, costume designer Jose Fernandez stated that he was opposed to "sharpening" the nipples, calling them "ridiculous". In 2022, Tim Burton commented about Warner Bros.' decision to replace him as director with Schumacher after "Batman Returns": "You complain about me, I'm too weird, I'm too dark, and then you put nipples on the costume? Go fuck yourself." George Clooney's screen-worn suit was put up for auction by Heritage Auctions in 2022 with a starting bid of $40,000. A previous owner had estimated it to be worth $100,000 in 2006, when Clooney was at the height of his career. The suit sold for $57,500. In the 2009 film "Watchmen", director Zack Snyder and comic book artist Dave Gibbons chose to parody the molded muscle and nipple Batsuit design from "Batman & Robin" for the Ozymandias costume. The film is referenced in the "Batman: The Brave and the Bold" episode "Legends of the Dark Mite!", when Bat-Mite briefly uses his powers to transform Batman's costume into the same suit shown in Schumacher's "Batman" films, before declaring it "too icky". Twenty-six years after the release of "Batman & Robin", Clooney made a cameo appearance as Bruce Wayne in the 2023 DC Extended Universe superhero film "The Flash". Clooney was asked to reprise the role when the film was already in post-production, agreeing to join after seeing a cut of the film; filming took place in secret six months before release and lasted half a day.
Canceled sequel. During the filming of "Batman & Robin", Warner Bros. was impressed with the dailies, prompting the studio to immediately hire Joel Schumacher to return as director for a fifth film. However, writer Akiva Goldsman turned down an offer to write the script. In late 1996, Warner Bros. and Schumacher hired Mark Protosevich to write the script for a fifth "Batman" film, and a projected mid-1999 release date was announced. The "Los Angeles Times" described the planned film as "continuing in the same vein with multiple villains and more silliness". Titled "Batman Unchained", Protosevich's script featured the Scarecrow as the main villain, who, through the use of his fear toxin, resurrects the Joker as a hallucination in Batman's mind. Harley Quinn would appear as a supporting character, written as the Joker's daughter. Schumacher approached Nicolas Cage to portray the Scarecrow while he was filming "Face/Off", and Courtney Love was considered for Harley Quinn. Schumacher said he begged the studio to let him do a "The Dark Knight Returns" story for the fifth film, but they wanted to keep the "family-friendly, toyetic thing". Clooney, O'Donnell, and Silverstone were set to return as Batman, Robin, and Batgirl, with Coolio as the Scarecrow. It was hoped that the villains from previous films would make cameo appearances in the hallucinations caused by the Scarecrow, culminating with Jack Nicholson reprising the role of the Joker. Following the poor critical and financial reception of "Batman & Robin", Clooney vowed never to reprise his role, and Warner Bros. cancelled any future "Batman" films, including Schumacher's planned "Batman Unchained". In a 2012 interview with "Access Hollywood", Chris O'Donnell claimed that a spin-off centered around the character of Robin was planned, but eventually scrapped due to "Batman & Robin"s poor commercial performance.
4730
36449898
https://en.wikipedia.org/wiki?curid=4730
Batman Forever
Batman Forever is a 1995 American superhero film based on the DC Comics character Batman by Bob Kane and Bill Finger. It is the third installment of the "Batman" film series, acting as a standalone sequel to "Batman Returns". Directed by Joel Schumacher and produced by Tim Burton and Peter MacGregor-Scott, it stars Val Kilmer as Bruce Wayne / Batman, replacing Michael Keaton, alongside Tommy Lee Jones, Jim Carrey, Nicole Kidman, and Chris O'Donnell. The film follows Batman as he attempts to prevent Two-Face (Jones) and the Riddler (Carrey) from uncovering his secret identity and extracting information from the minds of Gotham City's residents, while at the same time navigating his feelings for psychologist Dr. Chase Meridian (Kidman) and adopting orphaned acrobat Dick Grayson (O'Donnell)—who becomes his partner and best friend, Robin. Schumacher mostly eschewed the dark, dystopian atmosphere of Burton's films by drawing inspiration from the Batman comic books of the Dick Sprang era, as well as the 1960s television series. After Keaton chose not to reprise his role, William Baldwin and Ethan Hawke were considered as a replacement, before Val Kilmer joined the cast. "Batman Forever" was released on June 16, 1995, to mixed reviews from critics, who praised the visuals, action sequences, and soundtrack, but criticized the screenplay and tonal departure from the previous two films. The film was a box office success, grossing over $336 million worldwide and becoming the sixth-highest-grossing film of 1995. It was followed by "Batman & Robin" in 1997, with Schumacher returning as the director, O'Donnell returning as Robin, and George Clooney replacing Kilmer as Batman. Plot. In Gotham City, Batman defuses a hostage situation orchestrated by the criminal Two-Face, formerly district attorney Harvey Dent, who escapes. Flashbacks reveal that Batman failed to prevent Dent's disfigurement with acid by mobster Sal Maroni, causing Dent to develop a split personality, make decisions based on the flip of a coin, and swear vengeance against Batman. Edward Nygma, an eccentric and egotistical researcher at Wayne Enterprises, approaches his employer, Bruce Wayne, to present an invention that can beam television signals directly into the brain, demanding immediate approval directly from Bruce. Bruce rejects the device, as he is irritated by Nygma and is concerned that the technology could manipulate minds. After killing his abusive supervisor and staging it as a suicide, Nygma resigns and plots revenge against Bruce, sending him riddles. Criminal psychologist Chase Meridian diagnoses Nygma as psychotic. Bruce attends a circus with Chase. Two-Face hijacks the event and threatens to detonate a bomb unless Batman reveals his identity. Dick Grayson, the youngest member of the Flying Graysons family of acrobats, manages to throw the bomb into a river, but Two-Face kills his family in the process. Bruce invites the now-orphaned Dick to live at Wayne Manor as his ward, where he discovers that Bruce is Batman. Seeking to avenge the death of his family, Dick demands to join Batman in crime-fighting, hoping to kill Two-Face, but Bruce declines in order to help Dick move on instead, as he is considering retirement. Nygma becomes the Riddler and teams up with Two-Face. 
They commit a series of robberies to finance Nygma's new company and mass-produce his brainwave device dubbed the Box, which steals information from minds and transfers it to Nygma's, increasing his intelligence but also slowly causing him to lose his grip on reality. At a party hosted by Nygma, Batman pursues Two-Face and is almost killed before Dick saves him. Batman visits Chase, who explains that she has fallen in love with Bruce, and Bruce reveals his secret identity to her. Having discovered Bruce's secret through the Box, on Halloween night, Two-Face and the Riddler destroy the Batcave, shoot Bruce, and abduct Chase. As Bruce recovers, he and his butler, Alfred Pennyworth, deduce that Nygma is the Riddler. Bruce finally accepts Dick as his best friend and partner, Robin. At the Riddler's lair, Robin defeats Two-Face but chooses to spare him, allowing Two-Face to capture Robin at gunpoint. The Riddler reveals his final riddle: Chase and Robin, representing the two sides of Batman's personality, are trapped in tubes above a deadly drop, and Batman only has time to save one. Batman distracts the Riddler with a riddle of his own, before destroying the Riddler's brainwave receiver with a Batarang, damaging the Riddler's mind and enabling Batman to rescue both when he sees that the floor is an optical illusion. Two-Face corners them and flips his coin to decide their fate, but Batman throws a handful of identical coins in the air, causing Two-Face to fall to his death. Committed to Arkham Asylum, Nygma exclaims that he is Batman, having become completely delusional due to his scrambled memories. Bruce resumes his crusade as Batman, with Robin as his partner. Cast. In addition to the principal cast, United States Senator and Batman fan Patrick Leahy makes an uncredited appearance as himself. Production. Development. "Batman Returns" was released in 1992 with financial success and generally favorable reviews from critics, but Warner Bros. was disappointed with its box office run, having made $150 million less than the first film. After "Batman Returns" was deemed too dark and inappropriate for children, with McDonald's even recalling their Happy Meal tie-in, Warner Bros. concluded that the dark tone was the primary cause of the film's box office performance. After the film's release, Warner Bros. was not interested in Tim Burton's return as director. Burton noted he was unsure about returning to direct, writing: "I don't think Warner Bros. wanted me to direct a third "Batman". I even said that to them." Burton and Warner Bros. mutually agreed to part ways, though Burton would stay on as producer. John McTiernan turned down an offer to direct. In June 1993, Joel Schumacher was selected by Warner Bros., with Burton's approval, while he was filming "The Client". Lee and Janet Scott-Batchler, a husband-and-wife screenwriting team, were hired to write the script. Warner Bros. had lost a bidding war for their spec script titled "Smoke and Mirrors" to Disney's Hollywood Pictures. The project ultimately fell through, and Warner Bros. offered the Batchlers several of their film properties to write. Familiar with the "Batman" comics from their childhood, the Batchlers chose to work on the next "Batman" film. In a meeting with Burton, they agreed that "the key element to Batman is his duality. And it's not just that Batman is Bruce Wayne". Their original script introduced a psychotic Riddler, real name Lyle Heckendorf, with a pet rat accompanying him.
A scene cut from the final film included Heckendorf obtaining his costume from a fortune-telling leprechaun at the circus. Instead of NygmaTech, the company would have been named HeckTech. The story elements and much of the dialogue still remained in the finished film, though Schumacher felt it could be "lightened down". Keaton initially approved the selection of Schumacher as director and planned on reprising his role as Batman from the first two films. Schumacher claims he originally had in mind an adaptation of Frank Miller's "Batman: Year One", and Keaton claimed that he was enthusiastic about the idea. Warner Bros. rejected the idea as they wanted a sequel, not a prequel, though Schumacher was able to include very brief events from Bruce Wayne's childhood along with some events of the comic "The Dark Knight Returns". Akiva Goldsman, who worked with Schumacher on "The Client", was brought in to rewrite the script, deleting the initial idea of bringing in the Scarecrow as a villain with the Riddler, and the return of Catwoman. Burton, who was by then more interested in directing "Ed Wood", later reflected he was taken aback by some of the focus group meetings for "Batman Forever", a title he hated. Producer Peter MacGregor-Scott represented the studio's aim in making a film for the MTV Generation, with full merchandising appeal. Casting. Production went on a fast track with Rene Russo cast as Chase Meridian, but Keaton decided not to reprise Batman, rejecting the script because he did not like the direction the series was headed in. Keaton's departure was announced in July 1994. Keaton also wanted to pursue "more interesting roles", turning down $15 million. A decision was made to go with a younger actor for Bruce Wayne, and an offer was made to Ethan Hawke, who turned it down but eventually regretted the decision; he would later voice the character in the preschool animated series "Batwheels" in 2022. Schumacher had seen Val Kilmer in "Tombstone", but was also interested in William Baldwin, Ralph Fiennes (who would later voice Alfred Pennyworth in "The Lego Batman Movie" in 2017), and Daniel Day-Lewis. While Burton pushed for Johnny Depp to get the role, Kurt Russell was also considered. Kilmer, who as a child visited the studios where the 1960s series was recorded, and shortly before had visited a bat cave in Africa, was contacted by his agent for the role. Kilmer signed on by July 1994 without reading the script or knowing who the director was. With Kilmer's casting, Warner Bros. dropped Russo, considering her too old to be paired with Kilmer. Jeanne Tripplehorn and Linda Hamilton were considered for the role, which was eventually recast with Nicole Kidman. Kidman later revealed she took the role because she "wanted to kiss Batman." Billy Dee Williams took the role of Harvey Dent in "Batman" on the possibility of portraying Two-Face in a sequel, but Schumacher, who had worked with Tommy Lee Jones on "The Client", cast Jones in the role; Al Pacino, Clint Eastwood, Martin Sheen and Robert De Niro were also considered. Jones was reluctant to accept the role, but did so at his son's insistence. Robin Williams was in discussions to be the Riddler at one point, and was reportedly in competition for the role with John Malkovich. In June 1994, the role was given to Jim Carrey after Williams had reportedly turned it down. In a 2003 interview, Schumacher stated Michael Jackson had lobbied hard for the role, but was turned down before Carrey was cast.
Brad Dourif (who was Burton's original choice to portray the Joker, and later the Scarecrow), Kelsey Grammer, Micky Dolenz, Matthew Broderick, Phil Hartman and Steve Martin were said to have been considered. Robin had appeared in the shooting script for "Batman Returns" but was deleted because the film already had too many characters. Marlon Wayans had been cast in the role and signed on for a potential sequel, but when Schumacher took over, he decided to open up casting to other actors. Leonardo DiCaprio was considered, but decided not to pursue the role after a meeting with Schumacher. Among others, Matt Damon, Corey Haim, Corey Feldman, Mark Wahlberg, Michael Worth, Toby Stephens, Ewan McGregor, Jude Law, Alan Cumming and Scott Speedman were considered. Chris O'Donnell was cast, with Mitch Gaylord serving as his stunt double and also portraying Mitch Grayson, Dick's older brother, a character created for the film. Schumacher attempted to create a cameo role for Bono as his MacPhisto character, but both came to agree it was not suitable for the film. Filming. Principal photography began on September 24, 1994, and wrapped on March 5, 1995. Schumacher hired Barbara Ling for production design, claiming that the film needed a "force" in its design and that Ling could "advance on it". Schumacher wanted a design in no way connected to the previous films, and instead inspired by the images from the "Batman" comic books of the 1940s/early 1950s and New York City architecture of the 1930s, combined with modern Tokyo. He also wanted a "city with personality," with more statues and extensive neon. Difficulties and clashes. Schumacher and Kilmer clashed during the making of the film; Schumacher described Kilmer as "childish and impossible," reporting that he fought with various crewmen, and refused to speak to Schumacher for two weeks after the director told him to stop being rude. Schumacher also mentioned Tommy Lee Jones as a source of trouble: "Jim Carrey was a gentleman, and Tommy Lee was threatened by him. I'm tired of defending overpaid, overprivileged actors. I pray I don't work with them again." In a 2014 interview, Carrey acknowledged that Jones was not friendly to him, and recounted an incident wherein Jones told him: "I hate you. I really don't like you ... I cannot sanction your buffoonery." Design and visual effects. Rick Baker designed the prosthetic makeup. John Dykstra, Andrew Adamson and Jim Rygiel served as visual effects supervisors, with Rhythm & Hues Studios (R&H) and Pacific Data Images also contributing to visual effects work. R&H and PDI provided a CGI Batman for complicated stunts. For the costume design, producer Peter MacGregor-Scott claimed that 146 workers were at one point working together. Batman's costume was redesigned to give the suit a more "MTV organic, and edgier feel". Sound design and mixing were created and supervised by Bruce Stambler and John Levesque, work that included trips to caves to record bat sounds. A new Batmobile was designed for "Batman Forever", with two cars being constructed, one for stunt purposes and one for close-ups. Chris O'Donnell had the area around his eyes painted black and then the Robin mask glued on him. Swiss surrealist painter H. R. Giger provided his own design for the Batmobile, but it was considered too sinister for the film. The film used some motion capture for certain visual effects. Warner Bros. had acquired motion capture technology from arcade video game company Acclaim Entertainment for use in the film's production. Music.
Elliot Goldenthal was hired by Schumacher to compose the film score before the screenplay was written. In discussions with Schumacher, the director wanted Goldenthal to avoid taking inspiration from Danny Elfman, and requested an original composition. The film's promotional teaser trailer, however, used the main title theme from Elfman's score for 1989's "Batman". The soundtrack was commercially successful, selling almost as many copies as Prince's soundtrack to the 1989 "Batman" film. Only five of the songs on the soundtrack are actually featured in the movie. Hit singles from the soundtrack include "Hold Me, Thrill Me, Kiss Me, Kill Me" by U2 and "Kiss from a Rose" by Seal, both of which were nominated for MTV Movie Awards. "Kiss from a Rose" (whose music video was also directed by Joel Schumacher) also reached No. 1 on the U.S. charts. The soundtrack itself, featuring additional songs by The Flaming Lips, Brandy (both songs also included in the film), Method Man, Nick Cave, Michael Hutchence (of INXS), PJ Harvey and Massive Attack, was an attempt to (in producer Peter MacGregor-Scott's words) make the film more "pop". Release. Marketing. In addition to a large line of toys, video games and action figures from Kenner, the McDonald's food chain released several collectibles and mugs to coincide with the release of the film. Peter David and Alan Grant wrote separate novelizations of the film. Dennis O'Neil authored a comic book adaptation, with art by Michal Dutkiewicz. Six Flags Great Adventure theme park re-themed their "Axis Chemical" arena, home of the Batman stunt show, to resemble "Batman Forever", and the new show featured props from the film. Six Flags Over Texas featured a one-time fireworks show to promote the movie, and replica busts of Batman, Robin, Two-Face, and the Riddler could be found in the Justice League store in the Looney Tunes U.S.A. section until they were removed in 2023. "Batman: The Ride" opened at Six Flags St. Louis to promote the movie. At Six Flags Over Georgia, the Mind Bender roller coaster was redesigned to look as though it were the creation of the Riddler, and some images and props were used in the design of the roller coaster and its queue. Video games. Video games based on the film were released. A video game of the same name was released in 1995 for the Super Nintendo Entertainment System, Game Boy, Sega Genesis, Game Gear, R-Zone and MS-DOS; it was followed by "Batman & Robin" for the PlayStation, to promote the release of that film. An arcade version, "Batman Forever: The Arcade Game", was released in 1996 and later ported to consoles, and a pinball machine based on the film was released in 1995 by Sega Pinball. Home media. "Batman Forever" was released on VHS and LaserDisc on October 31, 1995. Over 3 million VHS copies were sold during the first week of release. The film was then released on DVD on May 20, 1997. This release was a double sided disc containing both widescreen (1.85:1) and full screen (1.33:1) versions of the film. "Batman Forever" made its Blu-ray debut on April 20, 2010. This was followed by an Ultra HD Blu-ray release on June 4, 2019. Deleted scenes. "Batman Forever" went through a few major edits before its release. Originally darker than the final product, the film's original length was closer to two hours and forty minutes, according to Schumacher. There was talk of an extended cut being released to DVD for the film's tenth anniversary in 2005.
While all four previous "Batman" films were given special-edition DVD releases on the same day as the "Batman Begins" DVD release, none of them were given extended cuts, although some scenes were in a deleted scenes section in the special features. Reception. Box office. "Batman Forever" opened in a record 2,842 theaters and 4,300 screens in the United States and Canada on June 16, 1995, grossing $52.8 million in its opening weekend, taking "Jurassic Park"s record for the highest opening-weekend gross of all time (it was surpassed two years later by "The Lost World: Jurassic Park"s $72.1 million). It had the largest opening weekend for a Warner Bros. film for six years, until it was surpassed by "Harry Potter and the Sorcerer's Stone" in 2001. The film also achieved the highest June opening weekend, holding that record until it was beaten by "Star Wars: Episode I – The Phantom Menace" in 1999, which was in turn overtaken by "Hulk" in 2003. It was the first film to gross $20 million in a single day, doing so on its opening Friday. The film also beat out "Congo" to reach the number one spot. It grossed $77.4 million in its first week, which was below the record $81.7 million set by "Jurassic Park". Additionally, the film held the record for the highest opening weekend for a superhero film until it was taken by "X-Men" in 2000. That year, "How the Grinch Stole Christmas" broke "Batman Forever"s record for the biggest opening weekend for any film starring Jim Carrey. While the film was overtaken by "Pocahontas" during its second weekend, it still made $29.2 million. It then became the first film of 1995 to reach $100 million domestically. The film started its international rollout in Japan on June 17, 1995, and grossed $2.2 million in 5 days from 167 screens, only 80% of the gross of its predecessor "Batman Returns". The film went on to gross $184 million in the United States and Canada, and $152.5 million in other countries, totaling $336.53 million. The film grossed more than "Batman Returns", and is the second-highest-grossing film from 1995 in the United States, behind "Toy Story", as well as the sixth-highest-grossing film of that year worldwide. Critical response. On Rotten Tomatoes, "Batman Forever" has an approval rating of 41% based on 73 reviews, with an average rating of 5.1/10. The site's critical consensus reads: "Loud, excessively busy, and often boring, "Batman Forever" nonetheless has the charisma of Jim Carrey and Tommy Lee Jones to offer mild relief." On Metacritic, the film has a weighted average score of 51 out of 100, based on 23 critics, indicating "mixed or average" reviews. Audiences polled by CinemaScore gave the film an average grade of "A−" on an A+ to F scale. Peter Travers of "Rolling Stone" wrote: ""Batman Forever" still gets in its licks. There's no fun machine this summer that packs more surprises." Travers criticized the film's excessive commercialism and felt that "the script misses the pain Tim Burton caught in a man tormented by the long-ago murder of his parents", but praised Kilmer's performance as having a "deftly understated [...] comic edge". James Berardinelli of "ReelViews" enjoyed the film, writing: "It's lighter, brighter, funnier, faster-paced, and a whole lot more colorful than before." On the television program "Siskel & Ebert", Gene Siskel of the "Chicago Tribune" and Roger Ebert of the "Chicago Sun-Times" both gave the film mixed reviews, but with the former giving it a thumbs up and the latter a thumbs down. In his written review, Ebert wrote: "Is the movie better entertainment?
Well, it's great bubblegum for the eyes. Younger children will be able to process it more easily; some kids were led bawling from "Batman Returns" where the PG-13 rating was a joke." Mick LaSalle of the "San Francisco Chronicle" had a mixed reaction, concluding: "a shot of Kilmer's rubber buns at one point is guaranteed to bring squeals from the audience." Brian Lowry of "Variety" believed: "One does have to question the logic behind adding nipples to the hard-rubber batsuit. Whose idea was that supposed to be anyway, Alfred's? Some of the computer-generated Gotham cityscapes appear too obviously fake. Elliot Goldenthal's score, while serviceable, also isn't as stirring as Danny Elfman's work in the first two films." Some observers thought Schumacher, a gay man, added possible homoerotic innuendo in the storyline. Regarding the costume design, Schumacher stated: "I had no idea that putting nipples on the Batsuit and Robin suit were going to spark international headlines. The bodies of the suits come from Ancient Greek statues, which display perfect bodies. They are anatomically correct." O'Donnell felt: "it wasn't so much the nipples that bothered me. It was the codpiece. The press obviously played it up and made it a big deal, especially with Joel directing. I didn't think twice about the controversy, but going back and looking and seeing some of the pictures, it was very unusual." Accolades. At the 68th Academy Awards, "Batman Forever" was nominated for Cinematography (lost to "Braveheart"), Sound (Donald O. Mitchell, Frank A. Montaño, Michael Herbick and Petur Hliddal; lost to "Apollo 13") and Sound Effects Editing (John Leveque and Bruce Stambler; also lost to "Braveheart"). "Hold Me, Thrill Me, Kiss Me, Kill Me" by U2 was nominated for the Golden Globe Award for Best Original Song (lost to "Colors of the Wind" from "Pocahontas"), but was also nominated for the Golden Raspberry Award for Worst Original Song (lost to "Walk into the Wind" from "Showgirls"). At the 22nd Saturn Awards, the film was nominated for Best Fantasy Film (lost to "Babe"), Make-up (lost to "Seven"), Special Effects (lost to "Jumanji") and Costume Design (lost to "12 Monkeys"). Composer Elliot Goldenthal was given a Grammy Award nomination. "Batman Forever" received six nominations at the 1996 MTV Movie Awards, four of which were divided between two categories (Carrey and Lee Jones for Best Villain; and Seal's "Kiss from a Rose" and U2's "Hold Me" in Best Song from a Movie). However, it won in just one category: Best Song from a Movie for Seal's "Kiss from a Rose". Legacy. Potential director's cut. As with the rest of the "Batman" films, cuts were made to the film based on audience reactions during test screenings. Photographs from these scenes have been available since the film's release, shown in magazines such as "Starlog". Some excerpts from these scenes appear in the music video for "Hold Me, Thrill Me, Kiss Me, Kill Me". In 2005, "Batman Forever" was the only film in the franchise to include a dedicated deleted scenes selection among its bonus content on the special edition DVD. After Joel Schumacher died on June 22, 2020, media outlets started reporting on the possible existence of an extended cut, with the first rumors being floated by American journalist Marc Bernardin. Bernardin claimed it to be darker and to contain less camp than the theatrical cut.
Some of the differences include Bruce having a vision of a human-sized bat, less of an emphasis on Dick Grayson, and a focus on Bruce's psychological issues with Chase. The cut uses about 50 minutes of additional footage. In an interview with "Variety", Warner Bros. confirmed that alternative test-screening cuts existed, although the studio has no plans to release them and is unsure about what, if any, footage remains. Later that year, on August 7, Kilmer's appearance at DC FanDome fueled fan speculation about the release of a so-called "Schumacher Cut". "Batman Forever" screenwriter Akiva Goldsman revealed in a YouTube interview in April 2021 that he had recently seen the original cut of the film (dubbed "Preview Cut: One") and that he expected it to resurface, suggesting all the footage needed to make the Schumacher cut still exists and that the release of a director's cut might be possible. In July 2023, following a private screening of a workprint version by director Kevin Smith, Goldsman confirmed that the original cut does exist, and even though Warner Bros. currently has no plans to release it, he said he was hopeful for a possible distribution in the future. Some of the aforementioned deleted scenes make up a portion of this footage. In July 2024, Goldsman reaffirmed the existence of the director's cut, while also stating that work to restore it had been put on hold following Warner Bros.' recent internal turmoil. In May 2025, an independent Los Angeles theater announced that a workprint of the director's cut would be screened at its venue later that month, but on May 24 the screening was cancelled following a cease-and-desist letter from Warner Bros. In July 2025, during an interview about his career, Goldsman said that while Warner Bros. is not currently interested in releasing the director's cut, he is still lobbying for it. "Batman '89". An alternate six-issue comic book continuation of "Batman Returns" titled "Batman '89" was launched on August 10, 2021, with its issues releasing monthly before concluding in July 2022. The series ignores the events of "Batman Forever" and "Batman & Robin" and brings back Keaton's Batman along with the dark setting of Burton's first two Batman films, as well as elements of his unmade third "Batman" film (particularly the return of Billy Dee Williams' Harvey Dent and his transformation into Two-Face, the introduction of new versions of Robin and Barbara Gordon, and the return of Catwoman). In response to a question as to whether Schumacher's Batman films are canon to the world of "Batman '89", Sam Hamm, screenwriter of the first two films and writer of the comic, confirmed that the latter two films take place in a diverging timeline and that the comic is not building toward that fate.
4733
7007500
https://en.wikipedia.org/wiki?curid=4733
Bidirectional text
A bidirectional text contains two text directionalities, right-to-left (RTL) and left-to-right (LTR). It generally involves text containing different types of alphabets, but may also refer to boustrophedon, which is changing text direction in each row. An example is the RTL Hebrew name Sarah: שרה, spelled sin (ש) on the right, resh (ר) in the middle, and heh (ה) on the left. Many computer programs failed to display this correctly, because they were designed to display text in one direction only. Some so-called right-to-left scripts such as the Persian script and Arabic are mostly, but not exclusively, right-to-left—mathematical expressions, numeric dates and numbers bearing units are embedded from left to right. That also happens if text from a left-to-right language such as English is embedded in them; or vice versa, if Arabic is embedded in a left-to-right script such as English. Bidirectional script support. Bidirectional script support is the capability of a computer system to correctly display bidirectional text. The term is often shortened to "BiDi" or "bidi". Early computer installations were designed only to support a single writing system, typically for left-to-right scripts based on the Latin alphabet only. Adding new character sets and character encodings enabled a number of other left-to-right scripts to be supported, but did not easily support right-to-left scripts such as Arabic or Hebrew, and mixing the two was not practical. Right-to-left scripts were introduced through encodings like ISO/IEC 8859-6 and ISO/IEC 8859-8, storing the letters (usually) in writing and reading order. It is possible to simply flip the left-to-right display order to a right-to-left display order, but doing this sacrifices the ability to correctly display left-to-right scripts. With bidirectional script support, it is possible to mix characters from different scripts on the same page, regardless of writing direction. In particular, the Unicode standard provides foundations for complete BiDi support, with detailed rules as to how mixtures of left-to-right and right-to-left scripts are to be encoded and displayed. Unicode bidi support. The Unicode standard calls for characters to be ordered 'logically', i.e. in the sequence they are intended to be interpreted, as opposed to 'visually', the sequence in which they appear. This distinction is relevant for bidi support because at any bidi transition, the visual presentation ceases to be the 'logical' one. Thus, in order to offer bidi support, Unicode prescribes an algorithm for how to convert the logical sequence of characters into the correct visual presentation. For this purpose, the Unicode encoding standard divides all its characters into one of four types: 'strong', 'weak', 'neutral', and 'explicit formatting'. Strong characters. Strong characters are those with a definite direction. Examples of this type of character include most alphabetic characters, syllabic characters, Han ideographs, non-European or non-Arabic digits, and punctuation characters that are specific to only those scripts. Weak characters. Weak characters are those with vague direction. Examples of this type of character include European digits, Eastern Arabic-Indic digits, arithmetic symbols, and currency symbols. Neutral characters. Neutral characters have direction indeterminable without context. Examples include paragraph separators, tabs, and most other whitespace characters. Punctuation symbols that are common to many scripts, such as the colon, comma, full-stop, and the no-break-space also fall within this category.
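To make the four-way classification concrete, the following is a minimal sketch (not part of the original article) using Python's standard unicodedata module, whose bidirectional() function returns the Unicode bidi class of a character; the sample characters are arbitrary.

```python
import unicodedata

# Print the Unicode bidirectional class of each character:
# strong classes include "L" (left-to-right) and "R"/"AL" (right-to-left);
# weak classes include "EN" (European number) and "AN" (Arabic number);
# whitespace reports as "WS" and common punctuation as e.g. "CS".
for ch in "A1 א٣,":
    name = unicodedata.name(ch, "SPACE")
    print(f"U+{ord(ch):04X} {name:<35} bidi class: {unicodedata.bidirectional(ch)}")
```

Running this prints "L" for the Latin letter, "EN" for the digit, "R" for the Hebrew letter, and "AN" for the Arabic-Indic digit, matching the strong/weak/neutral distinction described above.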
Explicit formatting. Explicit formatting characters, also referred to as "directional formatting characters", are special Unicode sequences that direct the algorithm to modify its default behavior. These characters are subdivided into "marks", "embeddings", "isolates", and "overrides". Their effects continue until the occurrence of either a paragraph separator, or a "pop" character. Marks. If a "weak" character is followed by another "weak" character, the algorithm will look at the first neighbouring "strong" character. Sometimes this leads to unintentional display errors. These errors are corrected or prevented with "pseudo-strong" characters. Such Unicode control characters are called "marks". The mark (U+200E LEFT-TO-RIGHT MARK, LRM, or U+200F RIGHT-TO-LEFT MARK, RLM) is to be inserted into a location to make an enclosed weak character inherit its writing direction. For example, to correctly display the trademark symbol ™ for an English brand name (LTR) in an Arabic (RTL) passage, an LRM mark is inserted after the trademark symbol if the symbol is not followed by LTR text. If the LRM mark is not added, the weak character ™ will be neighbored by a strong LTR character and a strong RTL character; hence, in an RTL context, it will be considered to be RTL, and displayed in an incorrect order. Embeddings. The "embedding" directional formatting characters are the classical Unicode method of explicit formatting, and as of Unicode 6.3, are being discouraged in favor of "isolates". An "embedding" signals that a piece of text is to be treated as directionally distinct. The text within the scope of the embedding formatting characters is not independent of the surrounding text. Also, characters within an embedding can affect the ordering of characters outside. Unicode 6.3 recognized that directional embeddings usually have too strong an effect on their surroundings and are thus unnecessarily difficult to use. Isolates. The "isolate" directional formatting characters signal that a piece of text is to be treated as directionally isolated from its surroundings. As of Unicode 6.3, these are the formatting characters that are being encouraged in new documents – once target platforms are known to support them. These formatting characters were introduced after it became apparent that directional embeddings usually have too strong an effect on their surroundings and are thus unnecessarily difficult to use. Unlike the legacy 'embedding' directional formatting characters, 'isolate' characters have no effect on the ordering of the text outside their scope. Isolates can be nested, and may be placed within embeddings and overrides. Overrides. The "override" directional formatting characters allow for special cases, such as for part numbers (e.g. to force a part number made of mixed English, digits and Hebrew letters to be written from right to left), and are recommended to be avoided wherever possible. As is true of the other directional formatting characters, "overrides" can be nested one inside another, and in embeddings and isolates. Using Unicode to override. Using U+202E RIGHT-TO-LEFT OVERRIDE (RLO) will switch the text direction from left-to-right to right-to-left. Similarly, using U+202D LEFT-TO-RIGHT OVERRIDE (LRO) will switch the text direction from right-to-left to left-to-right. Refer to the Unicode Bidirectional Algorithm. Pops. The "pop" directional formatting character, encoded at U+202C POP DIRECTIONAL FORMATTING (isolates are terminated by U+2069 POP DIRECTIONAL ISOLATE), terminates the scope of the most recent "embedding", "override", or "isolate".
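As an illustration of the formatting characters just described, here is a minimal sketch (again not from the original article) that wraps an LTR fragment in isolate characters so it cannot reorder surrounding RTL text; the code points are the standard ones from the Unicode Bidirectional Algorithm, while the function name and sample strings are invented for the example.

```python
# Standard Unicode directional formatting characters (UAX #9):
LRM = "\u200E"  # LEFT-TO-RIGHT MARK
RLM = "\u200F"  # RIGHT-TO-LEFT MARK
LRI = "\u2066"  # LEFT-TO-RIGHT ISOLATE
RLI = "\u2067"  # RIGHT-TO-LEFT ISOLATE
FSI = "\u2068"  # FIRST STRONG ISOLATE
PDI = "\u2069"  # POP DIRECTIONAL ISOLATE

def isolate_ltr(fragment: str) -> str:
    """Wrap an LTR fragment so it is directionally isolated from RTL context."""
    return f"{LRI}{fragment}{PDI}"

# Hypothetical example: an English brand name embedded in an Arabic sentence.
arabic_sentence = "\u0645\u0631\u062D\u0628\u0627 " + isolate_ltr("ACME 2000") + " \u0628\u0643\u0645"
print(arabic_sentence)
```

Because the isolate pair has no effect outside its own scope, the Arabic words on either side are ordered as if the English fragment were a single neutral placeholder.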
Runs. In the algorithm, each sequence of concatenated strong characters is called a "run". A "weak" character that is located between two "strong" characters with the same orientation will inherit their orientation. A "weak" character that is located between two "strong" characters with a different writing direction will inherit the main context's writing direction (in an LTR document the character will become LTR, in an RTL document, it will become RTL). Security. Unicode bidirectional characters are used in the Trojan Source vulnerability (a minimal detector sketch appears at the end of this article). Visual Studio Code highlights BiDi control characters since version 1.62, released in October 2021. Visual Studio highlights BiDi control characters since version 17.0.3, released on December 14, 2021. Scripts using bidirectional text. Egyptian hieroglyphs. Egyptian hieroglyphs were written bidirectionally, where the signs that had a distinct "head" or "tail" faced the beginning of the line. Chinese characters and other CJK scripts. Chinese characters can be written in either direction as well as vertically (top to bottom then right to left), especially in signs (such as plaques), but the orientation of the individual characters does not change. This can often be seen on tour buses in China, where the company name customarily runs from the front of the vehicle to its rear — that is, from right to left on the right side of the bus, and from left to right on the left side of the bus. English texts on the right side of the vehicle are also quite commonly written in reverse order. Likewise, other CJK scripts made up of the same square characters, such as the Japanese writing system and Korean writing system, can also be written in any direction, although horizontally left-to-right, top-to-bottom and vertically top-to-bottom right-to-left are the two most common forms. Boustrophedon. Boustrophedon is a writing style found in ancient Greek inscriptions, in Old Sabaic (an Old South Arabian language) and in Hungarian runes. This method of writing alternates direction, and usually reverses the individual characters, on each successive line. Moon type. Moon type is an embossed adaptation of the Latin alphabet invented as a tactile alphabet for the blind. Initially the text changed direction (but not character orientation) at the end of the lines. Special embossed lines connected the end of a line and the beginning of the next. Around 1990, it changed to a left-to-right orientation.
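Returning to the Security section above: a minimal detector for the bidi control characters abused in Trojan Source attacks (the same characters highlighted by Visual Studio Code and Visual Studio) might look like the following sketch. The character set is the standard one from the Unicode Bidirectional Algorithm, while the function name and the sample string are invented for the example.

```python
# Bidi control characters commonly flagged by Trojan Source scanners:
# embeddings/overrides (U+202A..U+202E), isolates (U+2066..U+2069),
# and the implicit marks U+200E/U+200F.
BIDI_CONTROLS = set("\u202A\u202B\u202C\u202D\u202E"
                    "\u2066\u2067\u2068\u2069\u200E\u200F")

def find_bidi_controls(source: str):
    """Yield (line, column, codepoint) for every bidi control in source."""
    for lineno, line in enumerate(source.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            if ch in BIDI_CONTROLS:
                yield lineno, col, f"U+{ord(ch):04X}"

# A contrived snippet hiding an override and isolates inside a string literal:
suspicious = 'access_level = "user\u202E \u2066# checks if admin\u2069 \u2066"'
for hit in find_bidi_controls(suspicious):
    print(hit)
```

A real linter would additionally check whether every embedding, override, and isolate is properly terminated before the end of the enclosing string or comment, which is the condition the Trojan Source paper identifies as dangerous.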
4734
28481209
https://en.wikipedia.org/wiki?curid=4734
Bernoulli's inequality
In mathematics, Bernoulli's inequality (named after Jacob Bernoulli) is an inequality that approximates exponentiations of $1+x$. It is often employed in real analysis. It has several useful variants; for integer exponents, $(1+x)^r \geq 1+rx$ holds for every integer $r \geq 1$ and real number $x \geq -1$ (case 1), for every integer $r \geq 0$ and real number $x \geq -2$ (case 2), and for every even integer $r \geq 0$ and every real number $x$ (case 3). History. Jacob Bernoulli first published the inequality in his treatise "Positiones Arithmeticae de Seriebus Infinitis" (Basel, 1689), where he used the inequality often. According to Joseph E. Hofmann, Über die Exercitatio Geometrica des M. A. Ricci (1963), p. 177, the inequality is actually due to Sluse in his Mesolabum (1668 edition), Chapter IV "De maximis & minimis". Proof for integer exponent. The first case has a simple inductive proof: suppose the statement is true for $r = k$: $(1+x)^k \geq 1+kx$. Then it follows that $(1+x)^{k+1} = (1+x)^k (1+x) \geq (1+kx)(1+x) = 1+(k+1)x+kx^2 \geq 1+(k+1)x$. Bernoulli's inequality can be proved for case 2, in which $r$ is a non-negative integer and $x \geq -2$, using mathematical induction in the following form: we prove the inequality for $r \in \{0, 1\}$, and from validity for some $r$ we deduce validity for $r+2$. For $r = 0$, $(1+x)^0 \geq 1+0 \cdot x$ is equivalent to $1 \geq 1$, which is true. Similarly, for $r = 1$ we have $(1+x)^1 = 1+x \geq 1+x$. Now suppose the statement is true for $r = k$: $(1+x)^k \geq 1+kx$. Then it follows that $(1+x)^{k+2} = (1+x)^k (1+x)^2 \geq (1+kx)(1+x)^2 = 1+(k+2)x + x^2 + kx^2(x+2) \geq 1+(k+2)x$, since $x^2 \geq 0$ as well as $kx^2(x+2) \geq 0$ for $x \geq -2$. By the modified induction we conclude the statement is true for every non-negative integer $r$. By noting that if $x < -2$ and the exponent is even, then $(1+x)^r \geq 0$ while $1+rx$ is negative, we obtain case 3. Generalizations. Generalization of exponent. The exponent $r$ can be generalized to an arbitrary real number as follows: if $x \geq -1$, then $(1+x)^r \geq 1+rx$ for $r \leq 0$ or $r \geq 1$, and $(1+x)^r \leq 1+rx$ for $0 \leq r \leq 1$. This generalization can be proved by convexity (see below) or by comparing derivatives. The strict versions of these inequalities require $x \neq 0$ and $r \neq 0, 1$. The case $0 < r \leq 1$ can also be derived from the case $r \geq 1$ by noting that (using the main case result with exponent $1/r \geq 1$ and argument $rx \geq -1$) $(1+rx)^{1/r} \geq 1 + \frac{1}{r} \cdot rx = 1+x$, and by using the fact that $t \mapsto t^r$ is monotonic. We can conclude that $(1+rx)^{1/r} \geq 1+x$ for $0 < r \leq 1$, therefore $(1+x)^r \leq 1+rx$ for $0 < r \leq 1$. The leftover case $r = 0$ is verified separately (both sides then equal $1$). Generalization of base. Instead of $1+x$ the inequality holds also in the form $(1+x_1)(1+x_2)\cdots(1+x_r) \geq 1+x_1+x_2+\cdots+x_r$, where $x_1, x_2, \dots, x_r$ are real numbers, all greater than $-1$, all with the same sign. Bernoulli's inequality is a special case when $x_1 = x_2 = \cdots = x_r = x$. This generalized inequality can be proved by mathematical induction. In the first step we take $r = 1$. In this case the inequality $1+x_1 \geq 1+x_1$ is obviously true. In the second step we assume validity of the inequality for $r$ numbers and deduce validity for $r+1$ numbers. We assume that $(1+x_1)(1+x_2)\cdots(1+x_r) \geq 1+x_1+x_2+\cdots+x_r$ is valid. After multiplying both sides with a positive number $(1+x_{r+1})$ we get: $(1+x_1)(1+x_2)\cdots(1+x_r)(1+x_{r+1}) \geq (1+x_1+x_2+\cdots+x_r)(1+x_{r+1})$. As $x_1, x_2, \dots, x_r, x_{r+1}$ all have the same sign, the products $x_1 x_{r+1}, x_2 x_{r+1}, \dots, x_r x_{r+1}$ are all positive numbers. So the quantity on the right-hand side can be bounded as follows: $(1+x_1+\cdots+x_r)(1+x_{r+1}) = 1+x_1+\cdots+x_r+x_{r+1} + (x_1+\cdots+x_r)x_{r+1} \geq 1+x_1+\cdots+x_r+x_{r+1}$, what was to be shown. Strengthened version. The following presents a strengthened version of the Bernoulli inequality, refining the estimate under specific conditions. Let the exponent $r$ be a nonnegative integer and let $x$ be a real number, with $x \geq -2$ if $r$ is odd and greater than 1. Then $(1+x)^r \geq 1+rx$, with equality if and only if $r \in \{0, 1\}$ or $x = 0$; in all other cases the inequality is strict. Related inequalities. The following inequality estimates the $r$-th power of $1+x$ from the other side. For any real numbers $x \geq -1$ and $r > 0$, one has $(1+x)^r \leq e^{rx}$, where $e = 2.718\ldots$. This may be proved using the inequality $1+x \leq e^x$ together with the monotonicity of $t \mapsto t^r$ on the non-negative reals.
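The statements above lend themselves to quick numerical spot-checks. The following is a minimal sketch (not part of the original article) that samples random values and asserts the case 2 inequality together with the $e^{rx}$ upper estimate; the small tolerances guard against floating-point noise.

```python
import math
import random

# Numeric sanity check (not a proof) of:
#   case 2: (1+x)^r >= 1 + r*x for integer r >= 0 and x >= -2,
#   upper estimate: (1+x)^r <= e^(r*x) for x >= -1 and r >= 0.
random.seed(0)
for _ in range(10_000):
    r = random.randint(0, 12)
    x = random.uniform(-2.0, 3.0)
    assert (1 + x) ** r >= 1 + r * x - 1e-9            # Bernoulli, case 2
    if x >= -1.0:
        assert (1 + x) ** r <= math.exp(r * x) + 1e-9  # reverse-side estimate
print("all checks passed")
```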
Alternative form. An alternative form of Bernoulli's inequality for $t \geq 1$ and $0 \leq x \leq 1$ is: $(1-x)^t \geq 1-xt$. This can be proved (for any integer $t$) by using the formula for geometric series: (using $y = 1-x$) $t = 1+1+\dots+1 \geq 1+y+y^2+\dots+y^{t-1} = \frac{1-y^t}{1-y}$, or equivalently $xt \geq 1-(1-x)^t$. Alternative proofs. Arithmetic and geometric means. An elementary proof for $0 \leq r \leq 1$ and $x \geq -1$ can be given using weighted AM-GM. Let $\lambda_1, \lambda_2$ be two non-negative real constants. By weighted AM-GM on $1, 1+x$ with weights $\lambda_1, \lambda_2$ respectively, we get $\frac{\lambda_1 \cdot 1 + \lambda_2 (1+x)}{\lambda_1+\lambda_2} \geq \sqrt[\lambda_1+\lambda_2]{(1+x)^{\lambda_2}}$. Note that $\frac{\lambda_1 + \lambda_2 (1+x)}{\lambda_1+\lambda_2} = 1 + \frac{\lambda_2}{\lambda_1+\lambda_2}x$ and $\sqrt[\lambda_1+\lambda_2]{(1+x)^{\lambda_2}} = (1+x)^{\frac{\lambda_2}{\lambda_1+\lambda_2}}$, so our inequality is equivalent to $1 + \frac{\lambda_2}{\lambda_1+\lambda_2}x \geq (1+x)^{\frac{\lambda_2}{\lambda_1+\lambda_2}}$. After substituting $r = \frac{\lambda_2}{\lambda_1+\lambda_2}$ (bearing in mind that this implies $0 \leq r \leq 1$) our inequality turns into $1+rx \geq (1+x)^r$, which is Bernoulli's inequality for $0 \leq r \leq 1$. The case $r \geq 1$ can be derived from $0 \leq r \leq 1$ in the same way as the case $0 \leq r \leq 1$ can be derived from $r \geq 1$; see above, "Generalization of exponent". Geometric series. Bernoulli's inequality $(1+x)^r \geq 1+rx$ is equivalent to $(1+x)^r - 1 - rx \geq 0$, and by the formula for geometric series (using $y = 1+x$) we get $(1+x)^r - 1 = y^r - 1 = (y-1)(y^{r-1} + y^{r-2} + \dots + 1) = x \sum_{k=0}^{r-1} (1+x)^k$, which leads to $(1+x)^r - 1 - rx = x \sum_{k=0}^{r-1} \left((1+x)^k - 1\right) \geq 0$. Now if $x \geq 0$, then by monotony of the powers each summand $(1+x)^k - 1 \geq 0$, and therefore their sum is non-negative, and hence so is the product on the left-hand side. If $-2 \leq x \leq 0$, then by the same arguments $(1+x)^k \leq 1$, and thus all addends $(1+x)^k - 1$ are non-positive and hence so is their sum. Since the product of two non-positive numbers is non-negative, we get again $(1+x)^r - 1 - rx \geq 0$. Binomial theorem. One can prove Bernoulli's inequality for $x \geq 0$ using the binomial theorem. It is true trivially for $r = 0$, so suppose $r$ is a positive integer. Then $(1+x)^r = 1 + rx + \binom{r}{2}x^2 + \dots + \binom{r}{r}x^r$. Clearly $\binom{r}{2}x^2 + \dots + \binom{r}{r}x^r \geq 0$, and hence $(1+x)^r \geq 1+rx$ as required. Using convexity. For $0 \neq x > -1$ the function $h(\alpha) = (1+x)^\alpha$ is strictly convex. Therefore, for $0 < \alpha < 1$ holds $(1+x)^\alpha = h(\alpha) = h((1-\alpha) \cdot 0 + \alpha \cdot 1) < (1-\alpha) h(0) + \alpha h(1) = 1 + \alpha x$, and the reversed inequality is valid for $\alpha < 0$ and $\alpha > 1$. Another way of using convexity is to re-cast the desired inequality to $\log(1+x) \geq \frac{1}{r} \log(1+rx)$ for real $r \geq 1$ and real $x > -\frac{1}{r}$. This inequality can be proved using the fact that the $\log$ function is concave, and then using Jensen's inequality in the form $\log\left(\frac{a + (r-1)b}{r}\right) \geq \frac{\log a + (r-1)\log b}{r}$ to give: $\log(1+x) = \log\left(\frac{(1+rx) + (r-1) \cdot 1}{r}\right) \geq \frac{\log(1+rx) + (r-1)\log 1}{r} = \frac{1}{r}\log(1+rx)$, which is the desired inequality.
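As a companion to the proofs above, here is a similar numerical spot-check (again a sketch, not from the original article) for the generalization of base: the product inequality over same-signed $x_i > -1$.

```python
import random

# Numeric spot-check (not a proof) of the generalization of base:
#   (1+x_1)...(1+x_r) >= 1 + x_1 + ... + x_r
# whenever all x_i > -1 share the same sign.
random.seed(1)
for _ in range(10_000):
    r = random.randint(1, 8)
    sign = random.choice([-1, 1])
    xs = [sign * random.uniform(0.0, 0.9) for _ in range(r)]  # same sign, > -1
    prod = 1.0
    for x in xs:
        prod *= 1.0 + x
    assert prod >= 1.0 + sum(xs) - 1e-9
print("ok")
```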
4735
10289486
https://en.wikipedia.org/wiki?curid=4735
Benjamin Franklin-class submarine
The "Benjamin Franklin" class of US ballistic missile submarines were in service from the 1960s to the 2000s. The class was an evolutionary development from the earlier of fleet ballistic missile submarine. Having quieter machinery and other improvements, it is considered a separate class. A subset of this class is the re-engineered 640 class starting with . The primary difference was that they were built under the new SUBSAFE rules after the loss of , earlier boats of the class had to be retrofitted to meet SUBSAFE requirements. The "Benjamin Franklin" class, together with the , , , and "James Madison" classes, comprised the "41 for Freedom" submarines that were the Navy's primary contribution to the nuclear deterrent force through the late 1980s. This class and the "James Madison" class are combined with the "Lafayette"s in some references. Design. The "Benjamin Franklin"-class submarines were built with the Polaris A-3 ballistic missile, and in the early 1970s were converted to carry the Poseidon C-3 missile. During the late 1970s and early 1980s, six boats were further modified to carry the Trident I (C-4) missile, along with six "James Madison"-class boats. These were "Benjamin Franklin", "Simon Bolivar", "George Bancroft", "Henry L. Stimson", "Francis Scott Key", and "Mariano G. Vallejo". In response to the loss of in April 1963, this class was designed to SUBSAFE standards and its equipment was similar to the fast attack submarines (SSNs). Previous US SSBNs except the "George Washington" class had equipment similar to the SSNs. This class can be distinguished by the fairwater planes' location halfway up the sail; the "Lafayette"s and "James Madison"s had the fairwater planes in the upper front portion of the sail. Two submarines of this class were converted for delivery of up to 66 SEALs or other Special Operations Forces each. In the early 1990s, to make room for the ballistic missile submarines within the limits set by the SALT II strategic arms limitation treaty, the ballistic missile tubes of and were disabled. Those boats were redesignated special operations attack submarines and given attack submarine (SSN) hull classification symbols. They were equipped with dry deck shelters to accommodate SEAL Delivery Vehicles or other equipment. Fate. The "Benjamin Franklin"s were decommissioned between 1992 and 2002 to comply with SALT II treaty limitations as the SSBNs entered service, for their age, and because of the collapse of the Soviet Union. USS "Kamehameha" was decommissioned on 2 April 2002, the last ship of the "Benjamin Franklin" class to be decommissioned. The sail of "George Bancroft" is preserved at the Naval Submarine Base King's Bay, Georgia. "James K. Polk"s sail is on display at the National Museum of Nuclear Science & History in Albuquerque, New Mexico. "Mariano G. Vallejo"s sail is preserved at Mare Island, California, where she was built. The sail of "Lewis and Clark" is on display at the Patriot's Point Maritime Museum in Charleston, South Carolina.
4736
87007
https://en.wikipedia.org/wiki?curid=4736
Bastard Operator From Hell
The Bastard Operator From Hell (BOFH) is a fictional rogue computer operator created by Simon Travaglia, who takes out his anger on users (who are "lusers" to him) and others who pester him with their computer problems, uses his expertise against his enemies and manipulates his employer. Several people have written stories about BOFHs, but only those by Simon Travaglia are considered canonical. The BOFH stories were originally posted to Usenet in 1992 by Travaglia, with some being reprinted in "Datamation". Since 2000 they have been published regularly in "The Register" (UK). Several collections of the stories have been published as books. By extension, the term is also used to refer to any system administrator who displays the qualities of the original. The early accounts of the BOFH took place in a university; later the scenes were set in an office workplace. In 2000 (BOFH 2k), the BOFH and his pimply-faced youth (PFY) assistant moved to a new company. Influence. The protagonist in Charles Stross's "The Laundry Files" series of novels named himself Bob Oliver Francis Howard in reference to the BOFH. As Bob Howard is a self-chosen pseudonym, and Bob is a network manager when not working as a computational demonologist, the name is all too appropriate. In the novella "Pimpf", he acquires a pimply-faced young assistant by the name of Peter-Fred Young. BOFH is also a text adventure game written by Howard A. Sherman, which took part in the 2002 Interactive Fiction Competition and placed 26th out of 38 entries. Simon Travaglia. Simon Travaglia (born 1964) graduated from the University of Waikato, New Zealand, in 1985. He worked at the University of Waikato as a computer operator (1985–1992) and as its IT infrastructure manager (2004–2008), and has been the infrastructure manager at the Waikato Innovation Park in Hamilton, New Zealand, since 2008. Since 1999 he has been a freelance writer for "The Register". He lives in Hautapu, New Zealand.
4737
20187702
https://en.wikipedia.org/wiki?curid=4737
Brownie McGhee
Walter Brown "Brownie" McGhee (November 30, 1915 – February 16, 1996) was an American folk and Piedmont blues singer and guitarist, best known for his collaboration with the harmonica player Sonny Terry. Life and career. McGhee was born in Knoxville, Tennessee, and grew up in Kingsport, Tennessee. At about the age of four he contracted polio, which severely limited use of his right leg. His brother Granville "Stick" McGhee, who also later became a musician and composed the song "Drinkin' Wine Spo-Dee-O-Dee", was nicknamed for pushing young Brownie around in a cart. Their father, George McGhee, was a factory worker, known around University Avenue for playing guitar and singing. Brownie's uncle made him a guitar from a tin marshmallow box and a piece of board. McGhee spent much of his youth immersed in music, singing with a local harmony group, the Golden Voices Gospel Quartet, and teaching himself to play guitar. He also played the five-string banjo and ukulele and studied piano. Surgery funded by the March of Dimes enabled McGhee to walk. At the age of 22, McGhee became a traveling musician, working in the Rabbit Foot Minstrels and befriending Blind Boy Fuller, whose guitar playing influenced him greatly. After Fuller's death in 1941, J. B. Long of Columbia Records promoted McGhee as "Blind Boy Fuller No. 2". By that time, McGhee was recording for Columbia's subsidiary Okeh Records in Chicago, but his real success came after he moved to New York in 1942, when he teamed up with Sonny Terry, whom he had known since 1939, when Terry was Fuller's harmonica player. The pairing was an overnight success. They recorded and toured together until around 1980. As a duo, Terry and McGhee did most of their work from 1958 until 1980, spending 11 months of each year touring and recording dozens of albums. Despite their later fame as "pure" folk artists playing for white audiences, in the 1940s Terry and McGhee had attempted to be successful recording artists, fronting a jump blues combo with honking saxophone and rolling piano, variously calling themselves "Brownie McGhee and his Jook House Rockers" or "Sonny Terry and his Buckshot Five", often with Champion Jack Dupree and Big Chief Ellis. They also appeared in the original Broadway productions of "Finian's Rainbow" and "Cat on a Hot Tin Roof". During the blues revival of the 1960s, Terry and McGhee were popular on the concert and music festival circuits, occasionally adding new material but usually remaining faithful to their roots and playing to the tastes of their audiences. Late in his life, McGhee appeared in small roles in films and on television. He and Terry appeared in the 1979 Steve Martin comedy "The Jerk". In 1987, McGhee gave a small but memorable performance as the ill-fated blues singer Toots Sweet in the supernatural thriller movie "Angel Heart." In his review of "Angel Heart", the critic Roger Ebert singled out McGhee for praise, declaring that he delivered a "performance that proves [saxophonist] Dexter Gordon isn't the only old musician who can act." McGhee appeared in the television series "Family Ties", in a 1988 episode entitled "The Blues, Brother", in which he played the fictional blues musician Eddie Dupre. He also appeared in the television series "Matlock", in a 1989 episode entitled "The Blues Singer", providing the singing voice and guitar playing for Joe Seneca's character who is accused of murder. In the episode, McGhee, Seneca and star Andy Griffith perform a version of "The Midnight Special". 
Happy Traum, a former guitar student of McGhee's, edited a blues guitar instruction guide and songbook, "Guitar Styles of Brownie McGhee", published in 1971, in which McGhee, between lessons, talked about his life and the blues. The autobiographical section features McGhee talking about growing up, his musical beginnings, and a history of the blues from the 1930s onward. McGhee and Terry were both recipients of a 1982 National Heritage Fellowship awarded by the National Endowment for the Arts, which is the United States government's highest honor in the folk and traditional arts. That year's fellowships were the first bestowed by the NEA. One of McGhee's last concert appearances was at the 1995 Chicago Blues Festival. He died of stomach cancer on February 16, 1996, in Oakland, California, at the age of 80. McGhee was portrayed by Joshua Henry in the 2024 film "A Complete Unknown".
4739
1301391787
https://en.wikipedia.org/wiki?curid=4739
International Bureau of Weights and Measures
The International Bureau of Weights and Measures (French: "Bureau international des poids et mesures", BIPM) is an intergovernmental organisation, through which its 64 member-states act on measurement standards in areas including chemistry, ionising radiation and physical metrology, as well as the International System of Units (SI) and Coordinated Universal Time (UTC). It is headquartered in the Pavillon de Breteuil in Saint-Cloud, near Paris, France. The organisation has been referred to as IBWM (from its name in English) in older literature. Function. The BIPM has the mandate to provide the basis for a single, coherent system of measurements throughout the world, traceable to the International System of Units (SI). This task takes many forms, from direct dissemination of units to coordination through international comparisons of national measurement standards (as in electricity and ionising radiation). Following consultation, a draft version of the BIPM Work Programme is presented at each meeting of the General Conference for consideration with the BIPM budget. The final programme of work is determined by the CIPM in accordance with the budget agreed to by the CGPM. Currently, the BIPM's main work includes activities in the areas listed above. The BIPM is one of the twelve member organisations of the International Network on Quality Infrastructure (INetQI), which promotes and implements QI activities in metrology, accreditation, standardisation and conformity assessment. The BIPM has an important role in maintaining accurate worldwide time of day. It combines, analyses, and averages the official atomic time standards of member nations around the world to create a single, official Coordinated Universal Time (UTC). Structure. The BIPM is overseen by the International Committee for Weights and Measures (French: "Comité international des poids et mesures", CIPM), a committee of eighteen members that normally meet in two sessions per year, which is in turn overseen by the General Conference on Weights and Measures (French: "Conférence générale des poids et mesures", CGPM) that meets in Paris usually once every four years, consisting of delegates of the governments of the Member States and observers from the Associates of the CGPM. These organs are also commonly referred to by their French initialisms. History. The creation of the International Bureau of Weights and Measures followed the Metre Convention of 1875, after the Franco-Prussian War (1870–1871), at the initiative of the International Geodetic Association. This process began with the 1855 Paris Exposition, shortly after the Great Exhibition, when the need for international standardization of weights and measures became apparent. During the process of unification of Germany, geodesists called for the establishment of an International Bureau of Weights and Measures in Europe. These trends culminated in the 1889 General Conference on Weights and Measures, with the distribution of the metre and kilogram standards to the States parties to the Metre Convention. When the metre was adopted as an international unit of length, it was well known that it no longer corresponded to its historical definition. Carlos Ibáñez e Ibáñez de Ibero, first president of both the International Geodetic Association and the International Committee for Weights and Measures, took part in the remeasurement and extension of the arc measurement of Delambre and Méchain. At that time, mathematicians like Legendre and Gauss had developed new methods for processing data, including the least squares method, which allowed experimental data tainted with observational errors to be compared with a mathematical model.
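The least squares method mentioned above is easy to illustrate in a few lines of modern code. The following minimal sketch (with invented data, not drawn from any historical survey) fits a straight line to noisy observations, the same principle geodesists applied when reconciling redundant arc measurements with a model.

```python
import numpy as np

# Fit y = a + b*t to observations contaminated with measurement error.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 25)
y_true = 2.0 + 0.5 * t
y_obs = y_true + rng.normal(scale=0.2, size=t.size)  # simulated observational errors

# Solve the least-squares problem min ||A @ p - y_obs||^2 over p = (a, b).
A = np.column_stack([np.ones_like(t), t])
p, residuals, rank, sv = np.linalg.lstsq(A, y_obs, rcond=None)
print("estimated intercept and slope:", p)
```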
Moreover, the International Bureau of Weights and Measures would come to play a central role in international geodetic measurements, as Charles Édouard Guillaume's discovery of invar minimized the impact of temperature-related systematic errors on measurements. Geodetic standards and the Expositions Universelles (1855/1867). In the 19th century, units of measurement were defined by primary standards, and unique artifacts made of different alloys with distinct coefficients of expansion were the legal basis of units of length. A wrought iron ruler, the Toise of Peru, also called "Toise de l'Académie", was the French primary standard of the toise, and the metre was officially defined by an artifact made of platinum kept in the National Archives. Besides the latter, another platinum standard and twelve iron standards of the metre were made by Étienne Lenoir in 1799. One of them became known as the Committee Meter in the United States and served as the standard of length in the United States Coast Survey until 1890. In 1855, the Dufour map (French: "Carte Dufour"), the first topographic map of Switzerland for which the metre was adopted as the unit of length, won the gold medal at the Exposition Universelle. However, the baselines for this map were measured in 1834 with measuring rods three toises long, calibrated on a toise made in 1821 by Jean Nicolas Fortin for Friedrich Georg Wilhelm von Struve. The geodetic measuring device calibrated on the metre, devised by Carlos Ibáñez e Ibáñez de Ibero and Frutos Saavedra Meneses, was displayed by Jean Brunner at the Exhibition. The four-metre-long Spanish measuring instrument, which became known as the Spanish Standard (French: "Règle espagnole"), was compared with Borda's double-toise N° 1, which served as a comparison module for the measurement of all geodesic bases in France, and was also to be compared to the Ibáñez apparatus. In order to maintain measurement traceability, it was important to control the temperature during these intercomparisons so as to avoid systematic errors. The Earth measurements thus underscored the importance of scientific methods at a time when statistics were being introduced into geodesy. As a leading scientist of his time, Carlos Ibáñez e Ibáñez de Ibero was one of the 81 initial members of the International Statistical Institute (ISI) and delegate of Spain to the first ISI session (now called the World Statistics Congress) in Rome in 1887. On the sidelines of the Exposition Universelle (1855) and the second Congress of Statistics held in Paris, an association with a view to obtaining a uniform decimal system of measures, weights and currencies was created in 1855. Under the impetus of this association, a Committee for Weights and Measures and Monies (French: "Comité des poids, mesures et monnaies") was created during the Exposition Universelle (1867) in Paris and called for the international adoption of the metric system. The metre and the Struve Geodetic Arc (1816/1855). In 1858, a Technical Commission was set up to continue the cadastral survey inaugurated under Muhammad Ali. This Commission suggested buying geodetic devices, which were ordered in France. Mohammed Sa'id Pasha entrusted Ismail Mustafa al-Falaki with the study of the precision apparatus calibrated against the metre, intended to measure geodetic baselines and built by Jean Brunner in Paris.
Ismail Mustafa had the task of carrying out the experiments necessary for determining the expansion coefficients of the two platinum and brass bars, and of comparing the Egyptian standard with a known standard. The Spanish standard designed by Carlos Ibáñez e Ibáñez de Ibero and Frutos Saavedra Meneses was chosen for this purpose, as it had served as a model for the construction of the Egyptian standard. It was not until 1954 that the connection of the southerly extension of the Struve Geodetic Arc, a chain of survey triangulations stretching from Hammerfest in Norway to the Black Sea, with an arc running northwards from South Africa through Egypt brought the course of a major meridian arc back to the land where Eratosthenes had founded geodesy. The Struve Geodetic Arc measurement extended over a period of forty years and initiated an international scientific collaboration between the Russian Empire and the United Kingdoms of Sweden and Norway, with the involvement of prominent astronomers such as Friedrich Georg Wilhelm von Struve, Friedrich Wilhelm Bessel, Carl Friedrich Gauss and George Biddell Airy. A French scientific instrument maker, Jean Nicolas Fortin, made three direct copies of the Toise of Peru: one for Friedrich Georg Wilhelm von Struve, a second for Heinrich Christian Schumacher in 1821, and a third for Friedrich Wilhelm Bessel in 1823. In 1831, Henri-Prudence Gambey also produced a copy of the Toise of Peru, which was kept at Altona Observatory in Hamburg. According to geodesists, these standards were secondary standards deduced from the Toise of Peru. In continental Europe, except in Spain, surveyors continued to use measuring instruments calibrated on the Toise of Peru. Among these, the toise of Bessel and the apparatus of Borda were respectively the main references for geodesy in Prussia and in France. These measuring devices consisted of bimetallic rulers of platinum and brass, or iron and zinc, fixed together at one extremity to assess the variations in length produced by any change in temperature: since the two metals expand at different rates, their relative displacement indicates the temperature, making it possible to take thermal expansion into account without a separate thermometer. Metric Act of 1866 and calls for an international standard unit of length. In 1866, Ferdinand Rudolph Hassler's use of the metre, and the creation of the Office of Standard Weights and Measures as an office within the Coast Survey, contributed to the introduction of the Metric Act of 1866, which allowed the use of the metre in the United States. These developments preceded the choice of the metre as the international scientific unit of length and the proposal by the 1867 General Conference of the European Arc Measurement (German: "Europäische Gradmessung") to establish the International Bureau of Weights and Measures. Moreover, it was asserted that the Toise of Peru, the standard of the toise constructed in 1735 for the French Geodesic Mission to the Equator, might be so much damaged that comparison with it would be worthless, while Bessel had questioned the accuracy of copies of this standard belonging to the Altona and Koenigsberg Observatories, which he had compared to each other about 1840. This assertion was particularly worrying, because when the primary Imperial yard standard had been partially destroyed in 1834, a new standard of reference was constructed using copies of the "Standard Yard, 1760" instead of the pendulum's length as provided for in the Weights and Measures Act 1824, because the pendulum method proved unreliable. International Geodetic Association.
The intimate relationship that necessarily existed between metrology and geodesy explains why the International Association of Geodesy, founded to combine the geodetic operations of different countries in order to reach a new and more exact determination of the shape and dimensions of the Globe, prompted the project of reforming the foundations of the metric system, while expanding it and making it international. This was not, as was mistakenly assumed for a time, because the Association entertained the unscientific idea of modifying the length of the metre to make it conform exactly to its historical definition according to the new values that would be found for the terrestrial meridian. Rather, busy combining the arcs measured in different countries and connecting the neighbouring triangulations, geodesists encountered, as one of the main difficulties, the unfortunate uncertainty that reigned over the equations of the units of length used. Adolphe Hirsch, General Baeyer and Colonel Ibáñez decided, in order to make all the standards comparable, to propose that the Association choose the metre as the geodetic unit, and that it create an international prototype metre differing as little as possible from the Mètre des Archives. In 1867, at the second General Conference of the International Association of Geodesy held in Berlin, the question of an international standard unit of length was discussed in order to combine the measurements made in different countries to determine the size and shape of the Earth. Following a preliminary proposal made in Neuchâtel the preceding year, the General Conference recommended the adoption of the metre in replacement of the toise of Bessel, the creation of an International Metre Commission, and the foundation of a world institute for the comparison of geodetic standards: the first step towards the creation of the International Bureau of Weights and Measures. Saint Petersburg Academy. Ferdinand Rudolph Hassler's metrological and geodetic work also had a favourable response in Russia. In 1869, the Saint Petersburg Academy of Sciences sent to the French Academy of Sciences a report drafted by Otto Wilhelm von Struve, Heinrich von Wild, and Moritz von Jacobi, whose theorem has long supported the assumption of an ellipsoid with three unequal axes for the figure of the Earth, inviting its French counterpart to undertake joint action to ensure the universal use of the metric system in all scientific work. The French Academy of Sciences and the Bureau des Longitudes in Paris drew the attention of the French government to this issue. In November 1869, Napoleon III issued invitations to join the International Metre Commission. The International Metre Commission (1870/1872). The French government gave practical support to the creation of an International Metre Commission, which met in Paris in 1870 and again in 1872 with the participation of about thirty countries. There was much discussion within this commission as to whether to keep as definitive the units represented by the standards of the Archives, or to return to the primitive definitions and correct the units to bring them closer to those definitions. Since its origin, the metre has kept a double definition: it is both the ten-millionth part of the quarter meridian and the length represented by the "Mètre des Archives". The first is historical, the second is metrological. The first solution prevailed, in accordance with common sense and with the advice of the French Academy of Sciences.
Abandoning the values represented by the standards would have consecrated an extremely dangerous principle, that of changing the units with every advance in measurement; the metric system would then be perpetually threatened with change, that is to say with ruin. Thus the Commission called for the creation of a new international prototype metre whose length would be as close as possible to that of the "Mètre des Archives", and for the arrangement of a system in which national standards could be compared with it. At the session on 12 October 1872 of the Permanent Committee of the International Metre Commission, which was to become the International Committee for Weights and Measures, Carlos Ibáñez e Ibáñez de Ibero was elected president. On 19 April 1875, Ibáñez' presidency was confirmed and Adolphe Hirsch was elected secretary of the International Committee for Weights and Measures. On 6 May 1873, during the 6th session of the French section of the Metre Commission, Henri Étienne Sainte-Claire Deville cast a 20-kilogram platinum-iridium ingot from Matthey in his laboratory at the École normale supérieure (Paris). On 13 May 1874, 250 kilograms of platinum-iridium, to be used for several national prototypes of the metre, was cast at the Conservatoire national des arts et métiers. When a conflict broke out regarding the presence of impurities in the metre-alloy of 1874, Carlos Ibáñez e Ibáñez de Ibero, a member of the Preparatory Committee since 1870 and president of the Permanent Committee of the International Metre Commission, intervened with the French Academy of Sciences to rally France to the project of creating an International Bureau of Weights and Measures equipped with the scientific means necessary to redefine the units of the metric system according to the progress of science. In fact, the chemical analysis of the alloy produced in 1874 by the French section revealed contamination by ruthenium and iron, which led the International Committee for Weights and Measures to reject, in 1877, the prototypes produced by the French section from the 1874 alloy. It also seemed at the time that the production of prototypes with an X profile was only possible through the extrusion process, which resulted in iron contamination. However, it soon turned out that the prototypes designed by Henri Tresca could be produced by milling. The 1875 Metre Convention and the founding of the BIPM. The principal tasks facing the delegates at the 1875 conference were the replacement of the existing metre and kilogram artefacts held by the French Government and the setting up of an organization to administer the maintenance of standards around the globe. The conference did not concern itself with other units of measure. The conference had undertones of Franco-German political manoeuvring, particularly since the French had been humiliated by the Prussians during the war a few years previously. Although France lost control of the metric system, it ensured that control passed to international rather than German hands and that the international headquarters were in Paris. Two members of the Permanent Committee of the International Metre Commission, the German astronomer Wilhelm Julius Foerster, director of the Berlin Observatory and of the German Weights and Measures Service, and the Swiss geodesist of German origin Adolphe Hirsch, were also among the main architects of the Metre Convention.
While the German astronomer Wilhelm Julius Foerster, along with the Russian and Austrian representatives, boycotted the Permanent Committee of the International Metre Commission in order to prompt the reunion of the Diplomatic Conference of the Metre and to promote the foundation of a permanent International Bureau of Weights and Measures, Adolphe Hirsch, delegate of Switzerland at this Diplomatic Conference in 1875, sided with Italy and Spain in creating, in spite of French reluctance, the International Bureau of Weights and Measures in France as a permanent institution, at the disadvantage of the "Conservatoire national des arts et métiers". Spain notably supported France in this outcome, and the first president of the International Committee for Weights and Measures, the Spanish geodesist Carlos Ibáñez e Ibáñez de Ibero, received the Grand Officer medal of the Légion d'Honneur for his diplomatic role in this issue and was awarded the Poncelet Prize for his scientific contribution to metrology. Indeed, as Carlos Ibáñez e Ibáñez de Ibero had been collaborating with the French on the extension of the arc measurement of Delambre and Méchain since 1853, and was president of both the Permanent Committee of the International Metre Commission since 1872 and the Permanent Commission of the International Association of Geodesy since 1874, he was to play a pivotal role in reconciling French and German interests. The Metre Convention was signed on 20 May 1875 in Paris, and the International Bureau of Weights and Measures was created under the supervision of the International Committee for Weights and Measures, presided over by Carlos Ibáñez e Ibáñez de Ibero. The 1889 General Conference on Weights and Measures. In 1889, the General Conference on Weights and Measures, presided over by Alfred Descloizeaux, met at the Pavillon de Breteuil, the seat of the International Bureau of Weights and Measures. It performed the first great deed dictated by the motto inscribed on the pediment of the splendid edifice that is the metric system: "A tous les temps, à tous les peuples" (For all times, to all peoples); this deed consisted in the approval and distribution, among the governments of the states supporting the Metre Convention, of prototype standards of hitherto unknown precision, intended to propagate the metric unit throughout the whole world. For metrology the matter of expansibility was fundamental: an error in measuring the temperature of a standard translates into an error in the measured length, in proportion to the expansibility of the standard, and the constantly renewed efforts of metrologists to protect their measuring instruments against the interfering influence of temperature revealed clearly the importance they attached to expansion-induced errors. It was common knowledge, for instance, that effective measurements were possible only inside a building whose rooms were well protected against changes in outside temperature, and that the very presence of the observer created an interference against which it was often necessary to take strict precautions. Thus, the Contracting States also received a collection of thermometers whose accuracy made it possible to ensure that of length measurements. The international prototype metre would also be a "line standard"; that is, the metre was defined as the distance between two lines marked on the bar, so avoiding the wear problems of end standards.
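The scale of these expansion-induced errors can be made concrete with a short sketch (not from the source; an illustrative Python example using approximate expansion coefficients, which are assumptions made for the purpose of the example):

    # First-order propagation of a temperature error dT into a length error dL
    # for a bar of length L: dL = alpha * L * dT.
    alpha_ptir = 8.6e-6    # approximate expansion coefficient of platinum-iridium, 1/K
    alpha_invar = 1.2e-6   # approximate expansion coefficient of invar, 1/K
    L = 1.0                # bar length in metres
    dT = 0.1               # temperature measurement error in kelvin

    for name, alpha in [("platinum-iridium", alpha_ptir), ("invar", alpha_invar)]:
        dL = alpha * L * dT
        print(f"{name}: length error of about {dL * 1e6:.2f} micrometres for a 0.1 K error")

A tenth of a degree of temperature uncertainty thus costs nearly a micrometre on a platinum-iridium metre bar, which is why calibrated thermometers were distributed along with the prototypes, and why the low-expansion alloy invar, discussed below, proved so significant.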
The comparison of the new prototypes of the metre with each other involved the development of special measuring equipment and the definition of a reproducible temperature scale. The BIPM's thermometry work led to the discovery of special iron–nickel alloys, in particular invar, whose practically negligible coefficient of expansion made it possible to develop simpler baseline measurement methods; for this work its director, the Swiss physicist Charles-Édouard Guillaume, was granted the Nobel Prize in Physics in 1920. Guillaume's Nobel Prize marked the end of an era in which metrology was leaving the field of geodesy to become an autonomous scientific discipline capable of redefining the metre through technological applications of physics. On the other hand, the foundation of the United States Coast and Geodetic Survey by Ferdinand Rudolph Hassler paved the way for the modern definition of the metre, with Charles Sanders Peirce being the first to experimentally link the metre to the wavelength of a spectral line. Albert Abraham Michelson soon took up the idea and improved it. Additional roles for standardized time. In 1987, the International Bureau of Weights and Measures took over some of the work of the International Time Bureau when the latter was dissolved. The remaining tasks were assigned to the International Earth Rotation and Reference Systems Service (IERS), which replaced the International Polar Motion Service and the earth-rotation section of the International Time Bureau. As early as 1936, irregularities in the speed of Earth's rotation due to the unpredictable movement of air and water masses had been discovered through the use of quartz clocks; they implied that the Earth's rotation was an imprecise way of determining time. In 1967, the second was redefined on the basis of atomic clocks, resulting in International Atomic Time (TAI). The International Bureau of Weights and Measures now draws on atomic clocks maintained worldwide, currently numbering more than 450. The 2019 revision of the SI. The International System of Units (SI, abbreviated from the French "Système international (d'unités)"), the modern form of the metric system, was revised in 2019. It is the only system of measurement with an official status in nearly every country in the world. It comprises a coherent system of units of measurement starting with seven base units, which are the second (the unit of time with the symbol s), metre (length, m), kilogram (mass, kg), ampere (electric current, A), kelvin (thermodynamic temperature, K), mole (amount of substance, mol), and candela (luminous intensity, cd). Since 2019, the magnitudes of all SI units have been defined by declaring exact numerical values for seven "defining constants" when expressed in terms of their SI units. These defining constants are the hyperfine transition frequency of caesium Δ"ν"Cs, the speed of light in vacuum "c", the Planck constant "h", the elementary charge "e", the Boltzmann constant "k", the Avogadro constant "N"A, and the luminous efficacy "K"cd. Directors. Since its establishment, the directors of the BIPM have been:
4741
49150429
https://en.wikipedia.org/wiki?curid=4741
Bayonne
Bayonne () is a city in southwestern France near the Spanish border. It is a commune and one of two subprefectures in the Pyrénées-Atlantiques department, in the Nouvelle-Aquitaine region. Bayonne is located at the confluence of the Nive and Adour rivers, in the northern part of the cultural region of the Basque Country. It is the seat of the Communauté d'agglomération du Pays Basque, which roughly encompasses the western half of Pyrénées-Atlantiques, including the coastal city of Biarritz. The area also constitutes the southern part of Gascony, where the Aquitaine Basin joins the beginning of the Pre-Pyrenees. Together with nearby Anglet, Biarritz, Saint-Jean-de-Luz and several smaller communes, Bayonne forms an urban area with 273,137 inhabitants at the 2018 census, 51,411 of whom lived in the commune of Bayonne proper. It is also a part of the Basque Eurocity Bayonne-San Sebastián. The site on the left bank of the Nive and the Adour was probably occupied before ancient times; a fortified enclosure was attested in the 1st century, when the Tarbelli occupied the territory. Archaeological studies have confirmed the presence of a Roman castrum in the 4th and 5th centuries. In 1023, Bayonne was the capital of Labourd. In the 12th century, it extended to the confluence of the Nive and beyond. During that time, its first bridge spanning the Adour was built. The city came under English control in 1152 through the marriage of Eleanor of Aquitaine; it became important commercially and, to a lesser degree, militarily thanks to maritime trade. In 1177, Richard the Lionheart of England took control of the city, separating it from the Viscounty of Labourd. In 1451, the city was taken by the Crown of France near the end of the Hundred Years' War. The loss of trade with the English was followed by the river gradually filling with silt and becoming impassable to ships. As the city developed to the north, its position was weakened compared to earlier times. The district of Saint-Esprit developed initially from settlement by Sephardic Jewish refugees fleeing the Spanish expulsions dictated by the Alhambra Decree. This community brought skill in chocolate making, and Bayonne gained a reputation for chocolate. The course of the Adour was changed in 1578 by dredging under the direction of Louis de Foix, and the river returned to its former mouth. Bayonne flourished after regaining the maritime trade that it had lost for more than a hundred years. In the 17th century, the city was fortified by Vauban, whose works were followed as models of defense for a hundred years. In 1814, Bayonne and its surroundings were the scene of fighting between the Napoleonic troops and the Spanish-Anglo-Portuguese coalition led by the Duke of Wellington; it was the last time the city was under siege. In 1951, the Lacq gas field was discovered in the region; most of its extracted oil and sulphur are shipped from the port of Bayonne. During the second half of the 20th century, many housing estates were built, forming new districts on the periphery. The city developed to form a conurbation with Anglet and Biarritz, and this agglomeration became the heart of a vast Basque-Landes urban area. In 2014, Bayonne was a commune with more than 45,000 inhabitants, the heart of the urban area of Bayonne and of the "Agglomeration Côte Basque-Adour", which includes Anglet and Biarritz. It is an important part of the Basque "Bayonne-San Sebastián Eurocity" and plays the role of economic capital of the Adour basin.
Modern industries (metallurgy and chemicals) have been established to take advantage of procurement opportunities and sea shipments through the harbour. Business services today represent the largest source of employment. Bayonne is also a cultural capital, a city with strong Basque and Gascon influences, and a rich historical past. Its heritage is expressed in its architecture, the diversity of collections in museums, its gastronomic specialties, and traditional events such as the noted Fêtes de Bayonne. The inhabitants of the commune are known as "Bayonnais" or "Bayonnaises". Toponymy. Etymology. While the modern Basque spelling is "Baiona", and the same in Gascon Occitan, "the name "Bayonne" poses a number of problems, both historical and linguistic, which have still not been clarified". There are different interpretations of its meaning. The termination "-onne" in "Bayonne" is found in many hydronyms in "-onne", and in toponyms derived from them. In certain cases the element "-onne" follows an Indo-European theme: "*ud-r/n" (Greek "húdōr", giving hydro-, and Gothic "watō" meaning "water"), hence "*udnā" meaning "water", giving "unna" and then "onno" in the glossary of Vienne. "Unna" therefore would refer to the Adour. This toponymic type, evoking a river traversing a locality, is common. The appellative "unna" seems to be found in the name of the Garonne ("Garunna" 1st century; "Garonna" 4th century). However, it is possible to see a pre-Celtic suffix "-ona" in the name of the Charente ("Karantona" in 875) or the Charentonne ("Carentona" in 1050). It could also be a Gascon augmentative from the Latin radical "Baia-" with the suffix "-ona", in the sense of "vast expanse of water", or a name derived from the Basque "bai" meaning "river" and "ona" meaning "good", hence "good river". The proposal by Eugene Goyheneche, repeated by Manex Goyhenetche and supported by Jean-Baptiste Orpustan, is "bai una", "the place of the river", or "bai ona", "hill by the river": "ibai" means "river" in Basque and "muinoa" means "hill". "It has perhaps been lost from sight that many urban place names in France, from north to south, came from the element "Bay-" or "Bayon-", such as Bayons, Bayonville, and Bayonvillers, and pose the unusual problem of whether they are Basque or Gascon," adds Pierre Hourmat. However, the most ancient form of Bayonne, "Baiona", clearly indicates a feminine form or a theme in "-a", whereas this is not the case for Béon or Bayon. In addition, the "Bayon-" in Bayonville or Bayonvillers in northern France clearly reflects the Germanic personal name "Baio". Old attestations. The names of the Basque province of Labourd and the locality of Bayonne have been attested from an early period, the place now called "Bayonne" appearing in the Latin form "Lapurdum" after a period during which the two names could in turn designate a viscounty or bishopric. "Labourd" and "Bayonne" were synonymous and used interchangeably until the 12th century, before being differentiated: Labourd for the province and Bayonne for the city. The attribution to Bayonne of "Civitas Boatium", a place mentioned in the Antonine Itinerary and by Paul Raymond in his 1863 dictionary, has been abandoned. The city of the "Boïates" may possibly be La Teste-de-Buch but is certainly not Bayonne. The following table details the origins of Labourd, Bayonne, and other names in the commune. History. Prehistory.
In the absence of accurate objective data, there is some credence to the probable existence of a fishing village on the site in a period prior to ancient times. Numerous traces of human occupation have been found in the Bayonne region from the Middle Paleolithic, especially in the discoveries at Saint-Pierre-d'Irube, a neighbouring locality. On the other hand, the presence of a mound about high has been detected in the current Cathedral Quarter overlooking the Nive, which formed a natural protection and a usable port on the left bank of the Nive. At the time, the mound was surrounded to the north and west by the Adour swamps. At its foot lies the famous "Bayonne Sea", the junction of the two rivers, which may have been about wide between Saint-Esprit and Grand Bayonne and totally covered the current location of Bourg-Neuf (in the district of Petit Bayonne). To the south, the last bend of the Nive widens near the Saint-Léon hills. Despite this, the narrowing of the Adour valley allows easier crossing here than anywhere else along the entire length of the estuary. In short, the strategic importance of this height was so obvious that it must be presumed to have always been inhabited. Ancient times. The oldest documented human occupation site is located on a hill overlooking the Nive and its confluence with the Adour. In the 1st century AD, during the Roman occupation, Bayonne already seems to have been of some importance, since the Romans surrounded the city with a wall to keep out the Tarbelli, Aquitani, or proto-Basque, who then occupied a territory that extended south of modern-day Landes to the modern French Basque country, the Chalosse, the valleys of the Adour, the mountain streams of Pau, Pyrénées-Atlantiques, and the Gave d'Oloron. The archaeological discoveries of October and November 1995 provided evidence to support this supposition. In the four layers of sub-soil along the foundation of the Gothic cathedral (in the "apse of the cathedral" area), objects from the end of the 1st century were found at a depth of 2 metres, in particular Gallic sigillated ceramics from Montans imitating Italian styles, thin-walled bowls, and fragments of amphorae. In the "southern sector" near the cloister door, there were objects from the second half of the 1st century as well as coins from the first half of the 3rd century. A very high probability of human presence, not solely military, seems provisionally to confirm the occupation of the site at least from around the third century. A Roman castrum dating to the end of the 4th century has been confirmed as a fortified place of Novempopulania. Its name, "Lapurdum", became the name of the province of "Labourd". According to Eugene Goyheneche, the name "Baiona" designated the city, the port, and the cathedral, while that of "Lapurdum" was only a territorial designation. This Roman settlement was strategic, as it allowed the monitoring of the trans-Pyrenean roads and of local peoples rebellious against Roman power. The construction covered 6 to 10 hectares, according to several authors. Middle Ages. The geographical location of the locality, at the crossroads of a river system oriented from east to west and a road network connecting Europe to the Iberian Peninsula from north to south, predisposed the site to the double role of fortress and port. The city, after being Roman, alternated between Vascon and English control for three centuries, from the 12th to the 15th century.
The Romans left the city in the 4th century, and the Basques, who had always been present, dominated the former province of Novempopulania between the Garonne, the ocean, and the Pyrenees. Novempopulania was renamed Vasconia, and then Gascony after a Germanic deformation (resulting from the Visigothic and Frankish invasions). The Basquisation of the plains was too weak to resist the advance of Romanisation, and from the mixture of Basque and Latin the Gascon language was created. Documentation on Bayonne for the period of the High Middle Ages is virtually nonexistent, with the exception of two Norman intrusions: one questionable in 844 and a second attested in 892. When Labourd was created in 1023, Bayonne was its capital and the Viscount resided there. The history of Bayonne proper started in 1056, when Raymond II the Younger, Bishop of Bazas, was given the mission to build the Church of Bayonne. The construction continued under the authority of Raymond III of Martres, Bishop of Bayonne from 1122 to 1125, together with Viscount Bertrand, for the Romanesque cathedral, the rear of which can still be seen today, and for the first wooden bridge across the Adour, extending the Mayou bridge over the Nive; this inaugurated the heyday of Bayonne. From 1120, new districts were created under population pressure. The areas between the old Roman city of Grand Bayonne and the Nive also developed during this period, then those between the Nive and the Adour, at the place that became Petit Bayonne. A Dominican convent was established there in 1225, followed by that of the Cordeliers in 1247. Construction of and modifications to the defences of the city also proceeded to protect the new districts. In 1130, the King of Aragon, Alfonso the Battler, besieged the city without success. Bayonne became an Angevin possession when Eleanor of Aquitaine married Henry Plantagenet, the future king of England, in 1152. This alliance gave Bayonne many commercial privileges. The Bayonnais became carriers of Bordeaux wines and other south-western products such as resin, ham, and woad to England. Bayonne was then an important military base. In 1177, King Richard separated the Viscounty of Labourd, whose capital then became Ustaritz. Like many cities at the time, Bayonne obtained the award of a municipal charter in 1215 and was emancipated from feudal powers. The Coutume unique to the city, officially published in 1273, remained in force for five centuries, until the separation of Bayonne from Labourd. Bayonnaise industry at that time was dominated by shipbuilding, wood (oak, beech, and chestnut from the Pyrenees, and pine from Landes) being overabundant. There was also maritime activity in providing crews for whaling, the merchant marine or, as was often the case at a time when it was easy to turn any merchant ship into a warship, the English Royal Navy. Renaissance and modern times. Jean de Dunois, a former companion in arms of Joan of Arc, captured the city on 20 August 1451 and annexed it to the Crown "without making too many victims", but at the cost of a war indemnity of 40,000 gold écus payable within a year, thanks to the opportunism of the bishop, who claimed to have seen "a large white cross surmounted by a crown which turns into a fleur-de-lis in the sky" to dissuade Bayonne from fighting against the royal troops. The city continued to be fortified by the kings of France to protect it from danger from the Spanish border.
In 1454, Charles VII created a separate judicial district, the "Seneschal of Lannes", a "single subdivision of Guyenne during the English period", which had jurisdiction over a wide area including Bayonne, Dax and Saint-Sever, and which exercised civil justice and such criminal jurisdiction as fell within the competence of the district councillors. Over time, the "Seneschal of the Sword", which was at Dax, lost any role other than protocol, and Bayonne, along with Dax and Saint-Sever, became the de facto seat of a separate seneschalty under the authority of a "lieutenant-general of the Seneschal". In May 1462, after signing the Treaty of Bayonne, King Louis XI authorized by letters patent the holding of two annual fairs; this was confirmed in the coutumes of the inhabitants in July 1472, following the death of Charles de Valois, Duke de Berry, the king's brother. At the time the Spanish Inquisition raged in the Iberian Peninsula, Jews fled Spain and, later, Portugal, and settled in southern France, including in Saint-Esprit (Pyrénées-Atlantiques), a northern district of Bayonne located along the northern bank of the Adour. They brought with them chocolate and the recipe for its preparation. In 1750, the Jewish population of Saint-Esprit is estimated to have reached about 3,500 people. The golden age of the city ended in the 15th century with the loss of trade with England and the silting of the port of Bayonne created by the movement of the course of the Adour to the north. At the beginning of the 16th century, Labourd suffered the arrival of the plague; its path can be tracked by reading the "Registers". In July 1515, the city of Bayonne was "prohibited to welcome people from plague-stricken places", and on 21 October, "we inhibit and prohibit all peasants and residents of this city [...] to go to the Parish of Bidart [...] because of the contagion of the plague". On 11 April 1518, the plague raged in Saint-Jean-de-Luz, and the city of Bayonne "inhibited and prohibited for all peasants and city inhabitants and other foreigners to maintain relationships at the location and Parish of Saint-Jean-de-Luz where people have died of the plague". On 11 November 1518, the plague was present in Bayonne, to the point that in 1519 the city council moved to the district of Brindos (Berindos at the time) in Anglet. In 1523, Marshal Odet of Foix, Viscount of Lautrec, resisted the Spaniards under Philibert of Chalon, in the service of Charles V, and lifted the siege of Bayonne. It was at Château-Vieux that the ransom demanded for the release of Francis I, taken prisoner after his defeat at the Battle of Pavia, was assembled. The meeting in 1565 between Catherine de Medici and the envoy of Philip II, the Duke of Alba, is known as the "Interview of Bayonne". At a time when Catholics and Protestants were tearing each other apart in parts of the kingdom of France, Bayonne seemed relatively untouched by these troubles. An iron fist on the part of the city's leaders was not unknown: in fact, they never hesitated to use violence and criminal sanctions to keep order in the name of the "public good". Two brothers, Saubat and Johannes Sorhaindo, who were both lieutenants of the mayor of Bayonne in the second half of the 16th century, perfectly embody this period. They often wavered between Catholicism and Protestantism but always sought to ensure the unity and prestige of the city.
In the 16th century, the king's engineers, under the direction of Louis de Foix, were dispatched to rearrange the course of the Adour by creating an estuary to maintain the river bed. The river discharged into the ocean at the intended place on 28 October 1578. The port of Bayonne then attained a greater level of activity. Fishing for cod and whale ensured the wealth of fishermen and shipowners. From 1611 to 1612, the college principal of Bayonne was a 26-year-old man with a future: Cornelius Jansen, known as "Jansenius", the future Bishop of Ypres. Bayonne thus became a birthplace of Jansenism, an austere doctrine which strongly disrupted the monarchy of Louis XIV. During the sporadic conflicts that troubled the French countryside from the mid-17th century, Bayonne peasants, short of powder and projectiles, attached long hunting knives to the barrels of their muskets, fashioning makeshift spears later called "bayonets". In that same century, Vauban was charged by Louis XIV to fortify the city. He added a citadel built on a hill overlooking the district of "San Espirit Cap deou do Punt". French Revolution and Empire. Activity in Bayonne peaked in the 18th century. The Chamber of Commerce was founded in 1726. Trade with Spain, the Netherlands, and the Antilles, the cod fishery off the shores of Newfoundland, and construction sites maintained a high level of activity in the port. In 1792, the district of Saint-Esprit (which the revolutionaries renamed "Port-de-la-Montagne"), located on the right bank of the Adour, was separated from the city and renamed "Jean-Jacques Rousseau". It was reunited with Bayonne on 1 June 1857; for 65 years, the autonomous commune was part of the department of Landes. In 1808, at the Château of Marracq, the act of abdication of the Spanish king Charles IV in favour of Napoleon was signed under the "friendly pressure" of the Emperor. In the process, the Bayonne Statute was initialed as the first Spanish constitution. Also in 1808, the French Empire imposed on the Duchy of Warsaw the Convention of Bayonne, under which the duchy bought from France the debts owed to France by Prussia. The debt, amounting to more than 43 million francs in gold, was bought at a discounted rate of 21 million francs. However, although the duchy made its payments in installments to France over a four-year period, Prussia was unable to pay it (owing to a very large indemnity it owed to France under the Treaties of Tilsit), causing the Polish economy to suffer heavily. Trade was the wealth of the city in the 18th century, but it suffered greatly in the 19th century, severely penalised by conflict with Spain, its historic trading partner in the region. The Siege of Bayonne marked the end of the period, with the surrender of the Napoleonic troops of Marshal Jean-de-Dieu Soult, defeated by the coalition led by Wellington, on 5 May 1814. 19th and 20th centuries. In 1854, the railway arrived from Paris, bringing many tourists eager to enjoy the beaches of Biarritz. Bayonne turned instead to the steel industry with the forges of the Adour. The port took on an industrial look, but its slow decline seemed inexorable in the 19th century; the discovery of the Lacq gas field later restored a certain dynamism. The Treaty of Bayonne was concluded on 2 December 1856.
It overcame the disputes in fixing the Franco-Spanish border in the area extending from the mouth of the Bidassoa to the border between Navarre and Aragon. The city built three light railway lines to connect to Biarritz at the beginning of the 20th century. The most direct line, that of the "Tramway Bayonne-Lycée–Biarritz", was operated from 1888 to 1948. In addition, a line further north served Anglet, operated by the "Chemin de fer Bayonne-Anglet-Biarritz" company from 1877 to 1953. Finally, a line following the Adour to its mouth and to the Atlantic Ocean by the bar in Anglet was operated by "VFDM réseau basque" from 1919 to 1948. On the morning of 23 December 1933, sub-prefect Anthelme received Gustave Tissier, the director of the "Crédit Municipal de Bayonne", and listened with some astonishment as the man unpacked what became the scam of the century: "Tissier, director of the "Crédit Municipal", was arrested and imprisoned under suspicion of forgery and misappropriation of public funds. He had issued thousands of false bonds in the name of "Crédit Municipal" [...]" This was the beginning of the Stavisky Affair which, together with other scandals and political crises, led to the Paris riots of 6 February 1934. The World Wars. The 249th Infantry Regiment, created from the 49th Infantry Regiment, was engaged in operations in the First World War, including action at the Chemin des Dames, especially on the plateau of Craonne. Some 700 Bayonnais perished in the conflict. A recruitment centre for foreign volunteers was established in Bayonne in August 1914. Many nationalities were represented, particularly Spaniards, Portuguese, Czechs, and Poles. During the Second World War, Bayonne was occupied by the 3rd SS Panzer Division Totenkopf from 27 June 1940 to 23 August 1944. On 5 April 1942, the Allies made a landing attempt at Bayonne, but after a barge penetrated the Adour with great difficulty, the operation was cancelled. On 21 August 1944, after blowing up twenty ships in port, German troops withdrew. On the 22nd, a final convoy of five vehicles passed through the city, transporting Gestapo and customs agents and some elements of the "Feldgendarmerie". One or more Germans opened fire with machine guns, killing three people. On the 23rd, a "special municipal delegation" was informally and immediately installed by the young deputy prefect Guy Lamassoure, representing the Provisional Government of the French Republic, which had been established in Algiers since 27 June. Politics and administration. List of mayors under the Ancien Régime. The Gramont family provided captains and governors in Bayonne from 1472 to 1789, as well as mayors, a post which became hereditary from 28 January 1590 by concession of Henry IV to Antoine II of Gramont. From the 15th century, they resided in the Château Neuf, then from the end of the 16th century in the Château-Vieux: Modern times. List of Successive Mayors. Cantons of Bayonne. As per the Decree of 22 December 1789, Bayonne was part of two cantons: Bayonne-North-East, which included part of Bayonne commune plus Boucau, Saint-Pierre-d'Irube, Lahonce, Mouguerre, and Urcuit; and Bayonne-North-West, which consisted of the rest of Bayonne commune plus Anglet, Arcangues, and Bassussarry. In a first revision of cantons in 1973, three cantons were created from the same total geographic area: Bayonne North, Bayonne East, and Bayonne West.
A further reconfiguration, in 1982, focused primarily on Bayonne; apart from Bayonne North Canton, which also includes Boucau, the cantons of Bayonne East and Bayonne West did not change. Starting from the 2015 French departmental elections, which took place on 22 and 29 March, a new division took effect following the decree of 25 February 2014. Once again three cantons centred on Bayonne were defined: Bayonne-1, with part of Anglet; Bayonne-2, which includes Boucau; and Bayonne-3. These now define the cantonal territorial division of the area. Judicial and administrative proceedings. Bayonne is the seat of many courts for the region. It falls under the jurisdiction of the "Tribunal d'instance" (District Court) of Bayonne, the "Tribunal de grande instance" (High Court) of Bayonne, the "Cour d'appel" (Court of Appeal) of Pau, the "Tribunal pour enfants" (Juvenile Court) of Bayonne, the "Conseil de prud'hommes" (Labour Court) of Bayonne, the "Tribunal de commerce" (Commercial Court) of Bayonne, the "Tribunal administratif" (Administrative Tribunal) of Pau, and the "Cour administrative d'appel" (Administrative Court of Appeal) of Bordeaux. The commune has a police station, a Departmental Gendarmerie, an Autonomous Territorial Brigade of the district Gendarmerie, squadron 24/2 of the Mobile Gendarmerie, and a tax collection office. Intercommunality. The commune is part of twelve inter-communal structures, of which eleven are based in the commune: The city of Bayonne is part of the "Communauté d'agglomération du Pays Basque", which also includes Anglet, Biarritz, Bidart, Boucau, Hendaye and Saint-Jean-de-Luz. The statutory powers of the structure extend to economic development (including higher education and research), housing and urban planning, public transport (operated through Transdev), waste collection and recovery, the management of rain and coastal waters, sustainable development, and interregional cooperation. In addition, Bayonne is part of the Basque Bayonne-San Sebastián Eurocity, a European economic interest grouping (EEIG) established in 1993 and based in San Sebastián. Twin towns – Sister cities. Bayonne has twinning associations with: Geography. Bayonne is located in the south-west of France, on the western border between the Basque Country and Gascony. It developed at the confluence of the Adour and its left-bank tributary, the Nive, 6 km from the Atlantic coast. The commune was part of the Basque province of Labourd. Geology and relief. Bayonne occupies a territory characterized by flat relief to the west and to the north towards the Landes forest, tending to rise slightly towards the south and east. The city developed at the confluence of the Adour and Nive, from the ocean. The meeting point of the two rivers coincides with a narrowing of the Adour valley. Above this, the alluvial plain extends for nearly towards both Tercis-les-Bains and Peyrehorade, and is characterized by swampy meadows called "barthes", which are influenced by floods and high tides. Downstream from this point, the river has shaped a large, wide bed in the sand dunes, creating a significant bottleneck at the confluence. The occupation of the hill that dominates this narrowing of the valley developed through a gradual spread across the lowlands, the occupants building embankments on ground raised by the aggradation of flood-deposited soil.
The Nive has played a leading role in the development of the Bayonne river system in recent geological time through the formation of alluvial terraces; these form the sub-soil of Bayonne beneath the surface accumulations of silt and aeolian sands. The drainage network of the western Pre-Pyrenees evolved mostly from the Quaternary, from south-east to north-west, oriented east–west. The Adour was captured by the gaves, and this system, together with the Nive, led to the emergence of a new alignment of the lower Adour and the Adour-Nive confluence. This capture has been dated to the beginning of the late Quaternary (80,000 years ago). Before this capture, the Nive had deposited pebbles of medium to large size from the Mindel glaciation; these slowed erosion of the hills, causing the bottleneck at Bayonne. After the deposit of the lowest alluvial terrace ( high at Grand Bayonne), the course of the Adour became fixed in its lower reaches. Subsequent to these deposits, there was a rise in sea level in the Holocene period (from 15,000 to 5,000 years ago). This explains the invasion of the lower valleys by fine sand, peat, and mud, with a thickness of more than below the current bed of the Adour and the Nive in Bayonne. These same deposits are spread across the barthes. In the late Quaternary, the current topographic physiognomy was formed: a set of hills overlooking a swampy lowland. The promontory of Bassussarry–Marracq ultimately extended to the Labourdin foothills; the Grand Bayonne hill is an example. Similarly, on the right bank of the Nive, the heights of Château-Neuf (Mocoron Hill) met the latest advance of the plateau of Saint-Pierre-d'Irube (height ). On the right bank of the Adour, the heights of Castelnau (today the citadel), with an altitude of , and of Fort (today Saint-Esprit), with an altitude of , rise above the barthes of the Adour, the Nive, Bourgneuf, Saint-Frédéric, Sainte-Croix, Aritxague, and Pontots. The area of the commune is and its altitude varies between . Hydrography. The city developed along the river Adour. The river is part of the Natura 2000 network from its source at Bagnères-de-Bigorre to its exit to the Atlantic Ocean after Bayonne, between Tarnos (Landes) on the right bank and Anglet (Pyrénées-Atlantiques) on the left bank. Apart from the Nive, which joins the left bank of the Adour after a sometimes tumultuous course of , two tributaries join the Adour in Bayonne commune: the "Ruisseau de Portou" and the "Ruisseau du Moulin Esbouc". Tributaries of the Nive are the "Ruisseau de Hillans" and the "Ruisseau d'Urdaintz", which both rise in the commune. Climate. The nearest weather station is that of Biarritz-Anglet. The climate of Bayonne is relatively similar to that of its neighbour Biarritz, described below, with fairly heavy rainfall; the oceanic climate is due to the proximity of the Atlantic Ocean. The average winter temperature is around 8 °C, and the summer average is around 20 °C. The lowest temperature recorded was −12.7 °C on 16 January 1985, and the highest was 40.6 °C on 4 August 2003, during the 2003 European heat wave. Rains on the Basque coast are rarely persistent, except during winter storms; they often take the form of intense thunderstorms of short duration. Transport. Road. Bayonne is located at the intersection of the A63 autoroute (Bordeaux-Spain) and the D1 extension of the A64 autoroute (towards Toulouse).
The city is served by three interchanges, two of them on the A63: exit (Bayonne Nord) serves the northern districts of Bayonne but also allows quick access to the centre, while exit (Bayonne Sud) provides access to the south and also serves Anglet. The third exit is on the D1/A64 via the Mousserolles interchange (exit Bayonne Mousserolles), which links the district of the same name and also serves the neighbouring communes of Mouguerre and Saint-Pierre-d'Irube. Bayonne was traversed by Route nationale 10, connecting Paris to Hendaye, but this has now been downgraded to the departmental road D810. Route nationale 117, linking Bayonne to Toulouse, has been downgraded to departmental road D817. Bridges. There are several bridges over both the Nive and the Adour, linking the various districts. Coming from upstream on the Adour, there is the A63 bridge, then the Saint-Frédéric bridge, which carries the D810, then the railway bridge that replaced the old Eiffel iron bridge, the Saint-Esprit bridge, and finally the Grenet bridge. The Saint-Esprit bridge connects the Saint-Esprit district to the Amiral-Bergeret dock just upstream of the confluence with the Nive. In 1845, the old bridge, originally made of wood, was rebuilt in masonry with seven arches supporting a deck wide. It was then called the Nemours Bridge in honour of Louis of Orleans, sixth Duke of Nemours, who laid the first stone; the bridge was finally called Saint-Esprit. Until 1868, the bridge had a moving span near the left bank. It was widened in 1912 to facilitate the movement of horse-drawn carriages and motor vehicles. On the Nive, coming from upstream to downstream, there is the A63 bridge, then the "Pont Blanc" (White Bridge) railway bridge, then the D810 bridge, the Génie bridge (or "Pont Militaire"), the Pannecau bridge, the Marengo bridge leading to the covered markets, and the Mayou bridge. The Pannecau bridge was long named "Bertaco bridge" and was rebuilt in masonry under Napoleon III. According to François Lafitte Houssat, "[...] a municipal ordinance of 1327 provided for the imprisonment of any quarrelsome woman of bad character in an iron cage dropped into the waters of the Nive from the bridge. The practice lasted until 1780 [...]" This punishment bore the evocative name of "cubainhade". Cycling network. The commune is traversed by the "Vélodyssée". Bicycle paths are located along the left bank of the Adour, a large part of the left bank of the Nive, and along various axes of the city, where there are some bicycle lanes. The city offers free bicycles on loan. Public transport. Urban network. Most of the lines of the "Chronoplus" bus network, operated by Transdev for the agglomeration of Bayonne, link Bayonne to other communes in the urban transport perimeter: Anglet, Biarritz, Bidart, Boucau, Saint-Pierre-d'Irube and Tarnos. A free shuttle serves the city centre (Grand and Petit Bayonne) by connecting several parking stations; other free shuttles perform other short trips within the commune. Interurban networks. Bayonne is connected to many cities in the western half of the department, such as Saint-Jean-de-Luz and Saint-Palais, by the Pyrénées-Atlantiques long-distance coach network "Transport 64", managed by the General Council. Since the network restructuring in the summer of 2013, the lines converge on Bayonne. Bayonne is also served by services from the Landes departmental network, "XL'R". Rail transport.
The Gare de Bayonne is located in the Saint-Esprit district and is an important station on the Bordeaux-Irun railway. It is also the terminus of the lines leading from Toulouse to Bayonne and from Bayonne to Saint-Jean-Pied-de-Port. It is served by TGV, Intercités, Intercités de nuit, and TER Nouvelle-Aquitaine trains (to Hendaye, Saint-Jean-Pied-de-Port, Dax, Bordeaux, Pau, and Tarbes). Air transport. Bayonne is served by the Biarritz – Anglet – Bayonne Airport (IATA code: BIQ, ICAO code: LFBZ), located on the communal territories of Anglet and Biarritz. The airport was returned to service in 1954, after repair of the damage from bombing during the Second World War. Demographics. In 2017, the commune had 51,228 inhabitants. Education. Bayonne commune is attached to the Academy of Bordeaux and has an information and guidance centre (CIO). As of 14 December 2015, Bayonne had 10 kindergartens and 22 elementary or primary schools (12 public and 10 private, including two ikastolas), as well as 2 public colleges (the Albert Camus and Marracq colleges) and 5 private colleges (La Salle Saint-Bernard, Saint Joseph, Saint-Amand, Notre-Dame and Largenté), which meet the criteria of the first cycle of second-degree studies. For the second cycle, Bayonne has 3 public high schools (the René-Cassin school (general education), the Louis de Foix school (general, technological and vocational education), and the Paul Bert vocational school) and 4 private high schools (Saint-Louis Villa Pia (general education), Largenté, Bernat Etxepare (general and technological), and the Le Guichot vocational school). There are also the Maurice Ravel Conservatory of Music, Dance, and Dramatic Art and the art school of the urban community of Bayonne-Anglet-Biarritz. Culture. Cultural festivities and events. For 550 years, the "Foire au Jambon" (Ham Fair) has been held every Holy Thursday, Friday and Saturday to mark the beginning of the season. An annual five-day summer festival has been held in the commune since 1932, organized around parades, bull races, fireworks, and music in the Basque and Gascon tradition. These festivals have become the most important festive events in France in terms of attendance. Bayonne has the oldest French bullfighting tradition: a bylaw regulating the "encierro" is dated 1283, and cows, oxen and bulls are released each year in the streets of Petit Bayonne during the summer festivals. The current arena, opened in 1893, is the largest in south-west France, with more than 10,000 seats. A dozen bullfights are held each year, attracting the biggest names in bullfighting, and throughout the summer several "novilladas" also take place. The city is a member of the "Union of French bullfighting cities". Health. Bayonne is the focus of much of the hospital services for the agglomeration of Bayonne and the southern Landes. In this area, all inhabitants are less than 35 km from a hospital offering medical, obstetrical, surgical, or psychiatric care. The hospitals for the whole Basque Coast are mainly established in Bayonne (the main sites of Saint-Léon and Cam-de-Prats) and also in Saint-Jean-de-Luz, which has several clinics. Religion. Christian worship. Bayonne is in the Diocese of Bayonne, Lescar and Oloron, a suffragan diocese since 2002 of the Archdiocese of Bordeaux. Monseigneur Marc Aillet has been the bishop of this diocese since 15 October 2008. The seat of the diocese is in Bayonne, in the Place Monseigneur-Vansteenberghe.
Besides Bayonne Cathedral in Grand Bayonne, Bayonne has the churches of Saint-Esprit, Saint Andrew (Rue des Lisses), Arènes (Avenue of the Czech Legion), Saint-Étienne, and Saint-Amand (Avenue Maréchal Soult). The "Carmel of Bayonne", located in the Marracq district, has had a community of Carmelite nuns since 1858. The "Way of Baztan" (also "ruta del Baztan" or "camino Baztanés") is a route of the Camino de Santiago pilgrimage which crosses the Pyrenees further west, by the lowest pass (the "Col de Belate", 847 m). It is the ancient road used by pilgrims who had come down to Bayonne, either along the coast on the "Way of Soulac" or because they had landed there (from England, for example), and who wished to join the French Way at Pamplona as soon as possible. The "Way of Bayonne" joins the French Way further on, at Burgos. The Protestant church is located at the corner of Rue Albert-Ier and Rue du Temple. A gospel church is located in the Saint-Esprit district, where there is also a church belonging to the Gypsy Evangelical Church of the Protestant Federation of France. Jewish worship. The synagogue was built in 1837 in the Saint-Esprit district north of the town. The Jewish community of Bayonne is old: it consists of different groups of fugitives from Navarre and Portugal who settled at Saint-Esprit-lès-Bayonne after the expulsion of the Jews from Spain in 1492 and from Portugal in 1496. In 1846, the Central Consistory moved to Saint-Esprit, which was integrated with Bayonne in 1857. The Jewish Cemetery of Bayonne was established in 1689 in the Saint-Étienne neighborhood in the northern quarter of the city. It was remodeled and enlarged in the 18th and 19th centuries and covers an area of two hectares. Economy. Population and income tax. In 2011, the median household taxable income was €22,605, placing Bayonne 28,406th among the 31,886 communes with more than 49 households in metropolitan France. In 2011, 47.8% of households were not taxable. Employment. In 2011, the population aged from 15 to 64 years numbered 29,007 persons, of whom 70.8% were economically active, 60.3% in employment and 10.5% unemployed. There were 30,012 jobs in the employment area in 2011, against 29,220 in 2006, while the number of employed workers residing in the area was 17,667; the job concentration indicator was therefore 169.9% (30,012 ÷ 17,667 ≈ 1.7), meaning the employment area offers nearly two jobs for every employed resident. Businesses and shops. Bayonne is the economic capital of the agglomeration of Bayonne and the southern Landes. In 2013, 549 new establishments were created in Bayonne, including 406 sole proprietorships. Workshops and industry. Bayonne has few industries of this kind. There is "Plastitube", specializing in plastic packaging (190 employees). The Izarra liqueur company set up a distillery in 1912 at Quai Amiral-Bergeret and long symbolized the economic wealth of Bayonne. Industrial activities are concentrated in the neighbouring communes of Boucau, Tarnos (Turbomeca), Mouguerre, and Anglet. Bayonne is known for its fine chocolates, produced in the town for 500 years, and for Bayonne ham, a cured ham seasoned with peppers from nearby Espelette. Izarra, the liqueur made in bright green or yellow colours, is distilled locally.
It is said by some that Bayonne is the birthplace of mayonnaise, supposedly a corruption of "Bayonnaise", the French adjective describing the city's people and produce. Today, "bayonnaise" can also refer to a particular mayonnaise flavoured with Espelette chillis. Bayonne is now the centre of certain craft industries that were once widespread, including the manufacture of "makilas", traditional Basque walking-sticks. The Fabrique Alza just outside the city is known for its "palas", the bats used in "pelota", the traditional Basque sport. Service activities. The active tertiary sector includes some large retail chains, such as those detailed by geographer Roger Brunet: BUT (240 staff), Carrefour (150 staff), E.Leclerc (150 staff), Leroy Merlin (130 staff), and Galeries Lafayette (120 employees). Banks, cleaning companies (Onet, 170 employees), and security firms (Brink's, 100 employees) are also major employers in the commune, as is urban transport, which employs nearly 200 staff. Five health clinics, providing a total of more than 500 beds, each employ 120 to 170 staff. The port of Bayonne. The port of Bayonne is located at the mouth of the Adour, downstream of the city. It also occupies parts of the communes of Anglet and Boucau in Pyrénées-Atlantiques and of Tarnos in Landes. It benefits greatly from the natural gas field of Lacq, to which it is connected by pipeline. It is the ninth-largest French port by trade, with an annual traffic of about 4.2 million tonnes, of which 2.8 million tonnes are exports, and it is the largest French port for the export of maize. It is the property of the Aquitaine region, which manages and controls the site. Movements of metallurgical products exceed one million tonnes per year, and maize exports to Spain vary between 800,000 and 1 million tonnes. The port also receives refined oil products from the TotalEnergies oil refinery at Donges (800,000 tonnes per year). Fertilizers account for 500,000 tonnes of traffic per year, and sulphur from Lacq, albeit in sharp decline, for 400,000 tonnes. The port also receives Ford and General Motors vehicles from Spain and Portugal, as well as wood, both tropical and from the Landes. Tourism services. Due to its proximity to the ocean and the foothills of the Pyrenees, as well as its historic heritage, Bayonne has developed important activities related to tourism. On 31 December 2012, there were 15 hotels in the city offering more than 800 rooms to visitors, but no camp sites. The tourist infrastructure in the surrounding urban area complements the local supply with around 5,800 rooms spread over nearly 200 hotels, and 86 campsites offering over 14,000 beds. Sights. The Nive divides Bayonne into Grand Bayonne and Petit Bayonne, with five bridges between the two, both quarters still being backed by Vauban's walls. The houses lining the Nive are examples of Basque architecture, with half-timbering and shutters in the national colours of red and green. The much wider Adour is to the north. The Pont Saint-Esprit connects Petit Bayonne with the Quartier Saint-Esprit across the Adour, where the massive Citadelle and the railway station are located. Grand Bayonne is the commercial and civic hub, with small pedestrianised streets packed with shops, plus the cathedral and the Hôtel de Ville. The Cathédrale Sainte-Marie is a Gothic-style building constructed between the 13th and 15th centuries.
The tower spires were not added until the 19th century, during a substantial restoration project. The cathedral houses the shrine of Saint-Léon de Carentan, 9th-century Bishop of Bayonne, and is a recognized UNESCO World Heritage Site. Nearby is the Château Vieux, some of which dates back to the 12th century, where the governors of the city were based, including the English Black Prince. The Musée Basque is an ethnographic museum of the entire Basque Country. Opened in February 1924, the museum has special exhibitions on Basque agriculture and history, seafaring, "pelota", and handicrafts. The Musée Bonnat began with a large collection bequeathed by the local-born painter Léon Bonnat. The museum is one of the best galleries in south-west France and has paintings by Edgar Degas, El Greco, Sandro Botticelli, and Francisco Goya, among others. At the back of Petit Bayonne, among the ramparts, is the Château Neuf. Now an exhibition space, it was started by the newly arrived French in 1460 to control the city. The walls nearby have been opened to visitors; they are now important for plant life, and Bayonne's botanic gardens adjoin them on both sides of the Nive. The area across the Adour is largely residential and industrial, with much demolished to make way for the railway. The Saint-Esprit church was part of a bigger complex built by Louis XI to care for pilgrims to Santiago de Compostela. It is home to a wooden "Flight into Egypt" sculpture. Overlooking the quarter is Vauban's 1680 Citadelle. The soldiers of Wellington's army who died besieging the citadelle in 1813 are buried in the nearby English Cemetery, visited by Queen Victoria and other British dignitaries when staying in Biarritz. The distillery of the famous local liqueur Izarra is located on the northern bank of the Adour and is open to visitors.
Bubblegum Crisis
is a 1987–1991 Japanese cyberpunk original video animation (OVA) series produced by Youmex and animated by AIC and Artmic. The series involves the adventures of the Knight Sabers, an all-female group of mercenaries who don powered exoskeletons and fight numerous threats, most frequently rogue robots. The success of the series spawned several sequel series. Plot. The series begins in late 2032, seven years after the Second Great Kantō earthquake split Tokyo geographically and culturally in two. The earthquake also forced the United States of America to annex Japan with the stated goal of keeping the peace and preventing it from descending into anarchy. In the first episode, disparities in wealth are shown to be more pronounced than in previous periods in postwar Japan. One of the series' themes is the inability of the AD Police to deal with threats due to political infighting, red tape, and an insufficient budget. The main adversary is Genom, a megacorporation with immense power and global influence. Its main product is the Boomer, a type of synthetic cyborg also known as a "cyberoid". While Boomers are intended to serve mankind, they become deadly instruments in the hands of ruthless individuals. The AD Police ("Advanced Police") are tasked with dealing with Boomer-related crimes. Setting. The setting displays strong influences from the films "Blade Runner" and "Streets of Fire"; the opening sequence of episode 1 is even modeled on that of the latter. The humanoid robots known as "Boomers" in the series were inspired by several movies, including the Replicants of the aforementioned "Blade Runner", the titular cyborgs of the "Terminator" film franchise, and the Beast from the film "Krull". Suzuki explained the meaning behind the cryptic title in a 1993 "Animerica" interview: "We originally named the series 'bubblegum' to reflect a world in crisis, like a chewing-gum bubble that's about to burst." Production. The series started with Toshimichi Suzuki's intention to remake the 1982 film "Techno Police 21C". In 1985, he met Junji Fujita; the two discussed ideas and decided to collaborate on what later became "Bubblegum Crisis". Kenichi Sonoda acted as character designer and designed the four female leads. Masami Ōbari created the mechanical designs and would also go on to direct episodes 5 and 6. Satoshi Urushihara acted as the chief production supervisor and guest character designer for episode 7. The OVA series is eight episodes long, made as eight separate works, with lengths varying from 26 to 52 minutes. A common misunderstanding, dating back at least to the mid-2000s, is that the series was planned and written to be 13 episodes, and that either legal or financial issues resulted in the series having only eight. The prevalence of this belief has resulted in it appearing in discussions of the series, including in Anime News Network articles and its encyclopedia, and in a 2011 "Otaku USA" feature. However, commentaries and interviews with production staff contradict this. Production designer Hideki Kakinuma said in commentary notes that appeared with the 2018 AnimEigo release of the OVA series, "At the time there was no plan to make it into a series, each film was going to be made one at a time."
Katsuhito Akiyama, director of OVA parts 1, 2, and 3, echoed this in a November 1997 interview, recalling the challenge of directing the OVA parts and creating a narrative without a long-term plan: "...it was not easy to keep on producing episodes without knowing a clear plan of how many total they want us to make." Further, the staff interviewed, including directors, voice actresses, character designers, and even AIC president Tōru Miura, do not mention any such issues hampering the project in these interviews and commentaries. Akiyama recalled the experience of working on the production as "fun". Cast. Additional voices. English: Amanda Tancredi, Chuck Denson Jr., Chuck Kinlaw, David Kraus, Eliot Preschutti, Gray Sibley, Hadley Eure, Hank Troscianiec, J. Patrick Lawlor, Jack Bowden, Jay Bryson, Kevin Reilly, Marc Garber, Marc Matney, Michael Sinterniklaas, Scott Simpson, Sean Clay, Sophia Tolar, Steve Lalla, Steve Rassin, Steve Vernon, Zach Hanner. Release. In North America, AnimEigo first released "Bubblegum Crisis" to VHS and Laserdisc in 1991 in Japanese with English subtitles. The series is notable in that it was one of the few early anime series brought over from Japan unedited and subtitled in English. While anime has become much more popular in the years since, in 1991 it was still mostly unknown as a storytelling medium in North America. "Bubblegum Crisis" aired in the US on PBS affiliate Superstation KTEH in the 1990s and on STARZ!'s Action Channel in 2000. An English dub of the series was produced beginning in 1994 by AnimEigo through Southwynde Studios in Wilmington, NC, and released to VHS and Laserdisc beginning that year. A digitally remastered compilation, featuring bilingual audio tracks and production extras, was released on DVD in 2004 by AnimEigo. The company later successfully crowdfunded a collector's edition Blu-ray release through Kickstarter in November 2013. The series was released on a regular edition Blu-ray on September 25, 2018. The series is currently available for streaming on Night Flight Plus. Soundtracks. There are eight soundtrack releases (one per OVA), as well as numerous "vocal" albums which feature songs "inspired by" the series as well as many drawn directly from it. Reception. Critical reception of "Bubblegum Crisis" has been generally positive. Raphael See of THEM Anime Reviews gave the series a rating of 4 out of 5 stars, praising the quality of the animation, the soundtrack, and the series' sense of humor. However, he suggested it was held back by a low-quality dub, a lack of character development, and an inconsistent plot, saying that while some episodes were "really solid", others would leave out many major details, forcing the viewer to make their own assumptions: "Overall, not a bad watch. In fact, at times, "Bubblegum Crisis" can be really good. Unfortunately, oversights and carelessness here and there keep this series from being all it can be." Tim Henderson of Anime News Network gave the series an A− rating, praising the animation, soundtrack, story, and characters. He states that the series gets better with every passing episode, and that the final two episodes are the best of the series. Legacy. Masaki Kajishima and Hiroki Hayashi, who both worked on the "Bubblegum Crisis" OVAs, cite the show as the inspiration for their harem series "Tenchi Muyo! Ryo-Ohki". In an interview with AIC, Hayashi described "Bubblegum Crisis" as "a pretty gloomy anime.
Serious fighting, complicated human relationships, and dark Mega Tokyo." They thought it would be fun to create some comedy episodes, with ideas like the girls going to the hot springs, but the idea was rejected by the sponsors. He also said that there was a trend of casts with many characters of one gender and a single character of the other, and asked what would happen if Mackey (Sylia's brother) were a main character, reversing the "Bubblegum" scenario. This idea then became the basis for Tenchi; Hayashi said that Mackey is "sort of" the original model for Tenchi. When Kevin Siembieda became aware that "Boomers" was already in use in this series, he changed his planned name for the "Boom Gun"-wielding power armor in the "Rifts" RPG, which was renamed the "Glitter Boy". Other entries. Crossover appearances. In 1993, the series appeared in "Scramble Wars", a crossover event between "Bubblegum Crisis", "Gall Force", "Genesis Survivor Gaiarth", "AD Police" and "Riding Bean". In 2023, the theme song "Konya Wa Hurricane" appeared in the series "Scott Pilgrim Takes Off". Other media. Novels. The series' creator Toshimichi Suzuki wrote two novels: Comic book. In Japan, a number of comic books were produced that featured characters and storylines based in the same universe. Some were very much thematically linked to the OVA series, while others were "one-shots" or comedy features. A number of artists participated in the creation of these comics, including Kenichi Sonoda, who had produced the original Knight Saber character designs. A North American comic based in the "Bubblegum Crisis" universe was published in English by Dark Horse Comics. Live-action movie. In May 2009, it was announced that a live-action movie of "Bubblegum Crisis" was in the early stages of production. A production agreement was signed at the 2009 Cannes Film Festival. The film was expected to be released in late 2012 with a budget of 30 million. The production staff was said to have consulted with the original anime's staff members, Shinji Aramaki and Kenichi Sonoda, to help maintain consistency with the world of the original. However, no further developments have been announced.
Black people
Black is a racial classification of people, usually a political and skin color-based category for specific populations with a mid- to dark brown complexion. In countries of the Western world with socially based systems of racial classification, the term "black" is often used to describe persons who are perceived as darker-skinned than other populations. It is most commonly used for people of sub-Saharan African ancestry, Indigenous Australians, and Melanesians, though it has been applied in many contexts to other groups, and is no indicator of any close ancestral relationship whatsoever. However, not all people considered "black" have dark skin, and often additional phenotypical characteristics are relevant, such as certain facial and hair-texture features. Indigenous African societies do not use the term "black" as a racial identity outside of influences brought by Western cultures. Contemporary anthropologists and other scientists, while recognizing the reality of biological variation between different human populations, regard the concept of a unified, distinguishable "Black race" as socially constructed. Different societies apply different criteria regarding who is classified "black", and these social constructs have changed over time. In a number of countries, societal variables affect classification as much as skin color, and the social criteria for "blackness" vary. Some perceive the term "black" as a derogatory, outdated, reductive or otherwise unrepresentative label, and as a result neither use nor define it, especially in African countries with little to no history of colonial racial segregation. In the Anglosphere, the term can carry a variety of meanings depending on the country. In the United Kingdom, "black" was historically equivalent to "person of color", a general term for non-European peoples. While the term "person of color" is commonly used and accepted in the United States, the near-sounding term "colored person" is considered highly offensive, except in South Africa, where it is a descriptor for a person of mixed race. In other regions such as Australasia, settlers applied the adjective "black" to the indigenous population. It was universally regarded as highly offensive in Australia until the 1960s and 70s. "Black" was generally not used as a noun, but rather as an adjective qualifying some other descriptor (e.g. "black ****"). As desegregation progressed after the 1967 referendum, some Aboriginals adopted the term, following the American fashion, but it remains problematic. Several American style guides, including the "AP Stylebook", changed their guidance to capitalize the "b" in "black" following the 2020 murder of George Floyd, an African American. The "ASA Style Guide" says that the "b" should not be capitalized.
He claims that darker-toned Arabs, much like darker-toned Latin Americans, consider themselves white because they have some distant white ancestry. Egyptian President Anwar Sadat had a mother who was a dark-skinned Nubian Sudanese (Sudanese Arab) woman and a father who was a lighter-skinned Egyptian. As a young man, responding to an advertisement for an acting position, he said, "I am not white but I am not exactly black either. My blackness is tending to reddish". Due to the patriarchal nature of Arab society, Arab men, including during the slave trade in North Africa, enslaved more African women than men. The female slaves were often put to work in domestic service and agriculture. The men interpreted the Quran to permit sexual relations between a male master and his enslaved females outside of marriage (see Ma malakat aymanukum and sex), leading to many mixed-race children. When an enslaved woman became pregnant with her Arab master's child, she was considered "umm walad" or "mother of a child", a status that granted her privileged rights. The child was given rights of inheritance to the father's property, so mixed-race children could share in any wealth of the father. Because the society was patrilineal, the children inherited their fathers' social status at birth and were born free. Some mixed-race children succeeded their respective fathers as rulers, such as Sultan Ahmad al-Mansur, who ruled Morocco from 1578 to 1608. He was not technically considered a mixed-race child of a slave; his mother was Fulani and a concubine of his father. In early 1991, non-Arabs of the Zaghawa people of Sudan attested that they were victims of an intensifying Arab apartheid campaign, segregating Arabs and non-Arabs (specifically, people of Nilotic ancestry). Sudanese Arabs, who controlled the government, were widely referred to as practicing apartheid against Sudan's non-Arab citizens. The government was accused of "deftly manipulating Arab solidarity" to carry out policies of apartheid and ethnic cleansing. Sudanese Arabs are themselves black people in the sense that they are culturally and linguistically Arabized indigenous peoples of Sudan of mostly Nilo-Saharan, Nubian, and Cushitic ancestry; their skin tone and appearance resemble those of other black people. American University economist George Ayittey accused the Arab government of Sudan of practicing acts of racism against black citizens. According to Ayittey, "In Sudan... the Arabs monopolized power and excluded blacks – Arab apartheid." Many African commentators joined Ayittey in accusing Sudan of practicing Arab apartheid. Sahara. In the Sahara, the native Tuareg Berber populations kept "negro" slaves. Most of these captives were of Nilo-Saharan extraction, and were either purchased by the Tuareg nobles from slave markets in the Western Sudan or taken during raids. Their origin is denoted via the Ahaggar Berber word "Ibenheren" (sing. "Ébenher"), which alludes to slaves who spoke only a Nilo-Saharan language. These slaves were also sometimes known by the borrowed Songhay term "Bella". Similarly, the Sahrawi indigenous peoples of the Western Sahara observed a class system consisting of high castes and low castes. Outside of these traditional tribal boundaries were "Negro" slaves, who were drawn from the surrounding areas. North-Eastern Africa.
In Ethiopia and Somalia, the slave classes mainly consisted of peoples captured along the Sudanese-Ethiopian and Kenyan-Somali international borders or in other surrounding areas, Nilotic and Bantu peoples who were collectively known as "Shanqella" and "Adone" (both analogues to "negro" in an English-speaking context). Some of these slaves were captured during territorial conflicts in the Horn of Africa and then sold off to slave merchants. The earliest representation of this tradition dates from a seventh or eighth century BC inscription belonging to the Kingdom of Damat. These captives and others of similar appearance were distinguished as "tsalim barya" (dark-skinned slaves), in contrast with the Afroasiatic-speaking nobles, or "saba qayh" ("red men"), and light-skinned slaves. Western racial categories, on the other hand, do not differentiate between "saba qayh" ("red men", light-skinned) and "saba tiqur" ("black men", dark-skinned) Horn Africans (whether of Afroasiatic-speaking, Nilotic-speaking or Bantu origin), considering all of them "black people" (and in some cases "negro") according to Western society's notion of race. Southern Africa. In South Africa, the period of colonisation resulted in many unions and marriages between Europeans and Africans (Bantu peoples of South Africa and Khoisan) from various tribes, resulting in mixed-race children. As the European colonialists acquired control of territory, they generally pushed the mixed-race and African populations into second-class status. During the first half of the 20th century, the white-dominated government classified the population according to four main racial groups: "Black", "White", "Asian" (mostly Indian), and "Coloured". The Coloured group included people of mixed Bantu, Khoisan, and European ancestry (with some Malay ancestry, especially in the Western Cape). The Coloured definition occupied an intermediary political position between the Black and White definitions in South Africa. The government imposed a system of legal racial segregation through a complex of laws known as apartheid. The apartheid bureaucracy devised complex (and often arbitrary) criteria in the Population Registration Act of 1950 to determine who belonged in which group. Minor officials administered tests to enforce the classifications. When it was unclear from a person's physical appearance whether the individual should be considered Coloured or Black, the "pencil test" was used. A pencil was inserted into a person's hair to determine if the hair was kinky enough to hold the pencil, rather than having it pass through, as it would with smoother hair. If so, the person was classified as Black. Such classifications sometimes divided families. Sandra Laing is a South African woman who was classified as Coloured by authorities during the apartheid era, due to her skin colour and hair texture, although her parents could prove at least three generations of European ancestors. At age 10, she was expelled from her all-white school. The officials' decisions based on her anomalous appearance disrupted her family and adult life. She was the subject of the 2008 biographical dramatic film "Skin", which won numerous awards. During the apartheid era, those classed as "Coloured" were oppressed and discriminated against; still, they had limited rights and overall slightly better socioeconomic conditions than those classed as "Black".
The government required that Blacks and Coloureds live in areas separate from Whites, creating large townships located away from the cities as areas for Blacks. In the post-apartheid era, the Constitution of South Africa has declared the country to be a "non-racial democracy". In an effort to redress past injustices, the ANC government has introduced laws in support of affirmative action policies for Blacks; under these, it defines "Black" people to include "Africans", "Coloureds" and "Asians". Some affirmative action policies favor "Africans" over "Coloureds" in terms of qualifying for certain benefits. Some South Africans categorized as "African Black" say that "Coloureds" did not suffer as much as they did during apartheid. "Coloured" South Africans are known to discuss their dilemma by saying, "we were not white enough under apartheid, and we are not black enough under the ANC (African National Congress)". In 2008, the High Court in South Africa ruled that Chinese South Africans who were residents during the apartheid era (and their descendants) are to be reclassified as "Black people", solely for the purposes of accessing affirmative action benefits, because they were also "disadvantaged" by racial discrimination. Chinese people who arrived in the country after the end of apartheid do not qualify for such benefits. Other than by appearance, "Coloureds" can usually be distinguished from "Blacks" by language. Most speak Afrikaans or English as a first language, as opposed to Bantu languages such as Zulu or Xhosa. They also tend to have more European-sounding names than Bantu names. Asia. Afro-Asians. "Afro-Asians" or "African-Asians" are persons of mixed sub-Saharan African and Asian ancestry. In the United States, they are also called "black Asians" or "Blasians". Historically, Afro-Asian populations have been marginalized as a result of human migration and social conflict. Western Asia. Arab world. In the medieval Arab world, the ethnic designation of "Black" encompassed not only the Zanj, or Africans, but also communities like the Zutt, Sindis and Indians from the Indian subcontinent. Historians estimate that between the advent of Islam in 650 CE and the abolition of slavery in the Arabian Peninsula in the mid-20th century, 10 to 18 million black Africans (known as the Zanj) were enslaved by east African slave traders and transported to the Arabian Peninsula and neighboring countries. This number far exceeded the number of slaves who were taken to the Americas. Slavery was abolished in Saudi Arabia and Yemen in 1962, in Dubai in 1963, and in Oman in 1970. Several factors affected the visibility of descendants of this diaspora in 21st-century Arab societies: The traders shipped more female slaves than males, as there was a demand for them to serve as concubines in harems in the Arabian Peninsula and neighboring countries. Male slaves were castrated in order to serve as harem guards. The death toll of black African slaves from forced labor was high. The mixed-race children of female slaves and Arab owners were assimilated into the Arab owners' families under the patrilineal kinship system. As a result, few distinctive Afro-Arab communities have survived in the Arabian Peninsula and neighboring countries. Distinctive and self-identified black communities have been reported in countries such as Iraq, with a reported 1.2 million black people (Afro-Iraqis), and they attest to a history of discrimination.
These descendants of the Zanj have sought minority status from the government, which would reserve some seats in Parliament for representatives of their population. According to Alamin M. Mazrui et al., generally in the Arabian Peninsula and neighboring countries, most of these communities identify as both black and Arab. Iran. Afro-Iranians are people of black African ancestry residing in Iran. During the Qajar dynasty, many wealthy households imported black African women and children as slaves to perform domestic work. This slave labor was drawn exclusively from the Zanj, Bantu-speaking peoples who lived in the African Great Lakes region, in an area roughly comprising modern-day Tanzania, Mozambique and Malawi. Israel. About 150,000 East African and black people live in Israel, amounting to just over 2% of the nation's population. The vast majority of these, some 120,000, are Beta Israel, most of whom are recent immigrants who came during the 1980s and 1990s from Ethiopia. In addition, Israel is home to more than 5,000 members of the African Hebrew Israelites of Jerusalem movement, who are descendants of African Americans who emigrated to Israel in the 20th century and who reside mainly in a distinct neighborhood in the Negev town of Dimona. Unknown numbers of black converts to Judaism reside in Israel, most of them converts from the United Kingdom, Canada, and the United States. Additionally, there are around 60,000 non-Jewish African immigrants in Israel, some of whom have sought asylum. Most of the migrants are from communities in Sudan and Eritrea, particularly the Niger-Congo-speaking Nuba groups of the southern Nuba Mountains; some are illegal immigrants. Turkey. Beginning several centuries ago, during the period of the Ottoman Empire, tens of thousands of Zanj captives were brought by slave traders to plantations and agricultural areas situated between Antalya and Istanbul, which gave rise to the Afro-Turk population in present-day Turkey. Some of their descendants remained "in situ", while many migrated to larger cities and towns. Other black slaves were transported to Crete, from where they or their descendants later reached the İzmir area through the population exchange between Greece and Turkey in 1923, or indirectly from Ayvalık in pursuit of work. Apart from the historical Afro-Turk presence, Turkey has also hosted a sizeable black immigrant population since the end of the 1990s. The community is composed mostly of modern immigrants from Ghana, Ethiopia, DRC, Sudan, Nigeria, Kenya, Eritrea, Somalia and Senegal. According to official figures, 1.5 million Africans live in Turkey, and around 25% of them are located in Istanbul. Other studies state that the majority of Africans in Turkey live in Istanbul, and report Tarlabaşı, Dolapdere, Kumkapı, Yenikapı and Kurtuluş as having a strong African presence. Most African immigrants come to Turkey intending to migrate onward to Europe. Immigrants from Eastern Africa are usually refugees, while Western and Central African immigration is reported to be economically driven. It is reported that African immigrants in Turkey regularly face economic and social challenges, notably racism and opposition to immigration by locals. Southern Asia. The Siddi are an ethnic group inhabiting India and Pakistan. Members are descended from the Bantu peoples of Southeast Africa. Some were merchants, sailors, indentured servants, slaves or mercenaries.
The Siddi population is currently estimated at 270,000–350,000 individuals, living mostly in Karnataka, Gujarat, and Hyderabad in India and in Makran and Karachi in Pakistan. In the Makran strip of the Sindh and Balochistan provinces in southwestern Pakistan, these Bantu descendants are known as the Makrani. There was a brief "Black Power" movement in Sindh in the 1960s, and many Siddi are proud of and celebrate their African ancestry. Southeastern Asia. Negritos are a collection of various, often unrelated peoples who were once considered a single distinct population of closely related groups, but genetic studies have shown that they descend from the same ancient East Eurasian meta-population that gave rise to modern East Asian peoples, that they consist of several separate groups, and that they display genetic heterogeneity. They inhabit isolated parts of Southeast Asia, and are now confined primarily to Southern Thailand, the Malay Peninsula, and the Andaman Islands of India. "Negrito" is the Spanish diminutive of "negro", meaning "little black person"; it is what the Spaniards called the aborigines they encountered in the Philippines. The term "Negrito" itself has come under criticism in countries like Malaysia, where it is now interchangeable with the more acceptable Semang, although this term actually refers to a specific group. They have dark skin, often curly hair and Asiatic facial characteristics, and are stockily built. Negritos in the Philippines frequently face discrimination. Because of their traditional hunter-gatherer lifestyle, they are marginalized and live in poverty, unable to find employment. Europe. Western Europe. France. While census collection of ethnic background is illegal in France, it is estimated that there are about 2.5–5 million black people residing there. Germany. As of 2020, there are approximately one million black people living in Germany. Netherlands. Afro-Dutch are residents of the Netherlands who are of Black African or Afro-Caribbean ancestry. They tend to be from the former and present Dutch overseas territories of Aruba, Bonaire, Curaçao, Sint Maarten and Suriname. The Netherlands also has sizable Cape Verdean and other African communities. Portugal. As of 2021, there were at least 232,000 people of recent Black-African immigrant background living in Portugal. They mainly live in the regions of Lisbon, Porto, and Coimbra. As Portugal does not collect data on ethnicity, the estimate includes only people who, as of 2021, held the citizenship of a sub-Saharan African country or who acquired Portuguese citizenship between 2008 and 2021, thus excluding descendants, people of more distant African ancestry, and people who settled in Portugal generations ago and are now Portuguese citizens. Spain. The term "Moors" has been used in Europe in a broader, somewhat derogatory sense to refer to Muslims, especially those of Arab or Berber ancestry, whether living in North Africa or Iberia. Moors were not a distinct or self-defined people. Medieval and early modern Europeans applied the name to Muslim Arabs, Berbers, Sub-Saharan Africans and Europeans alike. Isidore of Seville, writing in the 7th century, claimed that the Latin word Maurus was derived from the Greek "mauron", μαύρον, which is the Greek word for "black". Indeed, by the time Isidore of Seville came to write his "Etymologies", the word Maurus or "Moor" had become an adjective in Latin, "for the Greeks call black, mauron".
"In Isidore's day, Moors were black by definition..." Afro-Spaniards are Spanish nationals of West/Central African ancestry. Today, they mainly come from Cameroon, Equatorial Guinea, Ghana, Gambia, Mali, Nigeria and Senegal. Additionally, many Afro-Spaniards born in Spain are from the former Spanish colony Equatorial Guinea. Today, there are an estimated 683,000 Afro-Spaniards in Spain. United Kingdom. According to the Office for National Statistics, at the 2001 census there were more than a million black people in the United Kingdom; 1% of the total population described themselves as "Black Caribbean", 0.8% as "Black African", and 0.2% as "Black other". Britain encouraged the immigration of workers from the Caribbean after World War II; the first symbolic movement was of those who came on the ship the "Empire Windrush" and, hence, those who migrated between 1948 and 1970 are known as the Windrush generation. The preferred official umbrella term is "black, Asian and minority ethnic" (BAME), but sometimes the term "black" is used on its own, to express unified opposition to racism, as in the Southall Black Sisters, which started with a mainly British Asian constituency, and the National Black Police Association, which has a membership of "African, African-Caribbean and Asian origin". Eastern Europe. As African states became independent in the 1960s, the Soviet Union offered many of their citizens the chance to study in Russia. Over a period of 40 years, about 400,000 African students from various countries moved to Russia to pursue higher studies, including many black Africans. This extended beyond the Soviet Union to many countries of the Eastern bloc. Balkans. Due to the slave trade in the Ottoman Empire that had flourished in the Balkans, the coastal town of Ulcinj in Montenegro had its own black community. In 1878, that community consisted of about 100 people. Oceania. Indigenous Australians. Indigenous Australians have been referred to as "black people" in Australia since the early days of European settlement. While originally related to skin colour, the term is used today to indicate Aboriginal or Torres Strait Islander ancestry in general and can refer to people of any skin pigmentation. Being identified as either "black" or "white" in Australia during the 19th and early 20th centuries was critical in one's employment and social prospects. Various state-based Aboriginal Protection Boards were established which had virtually complete control over the lives of Indigenous Australians – where they lived, their employment, marriage, education and included the power to separate children from their parents. Aborigines were not allowed to vote and were often confined to reserves and forced into low paid or effectively slave labour. The social position of mixed-race or "half-caste" individuals varied over time. A 1913 report by Baldwin Spencer states that: After the First World War, however, it became apparent that the number of mixed-race people was growing at a faster rate than the white population, and, by 1930, fear of the "half-caste menace" undermining the White Australia ideal from within was being taken as a serious concern. Cecil Cook, the Northern Territory Protector of Natives, noted that: The official policy became one of biological and cultural assimilation: "Eliminate the full-blood and permit the white admixture to half-castes and eventually the race will become white". 
This led to different treatment for "black" and "half-caste" individuals, with lighter-skinned people targeted for removal from their families to be raised as "white" and prohibited from speaking their native language and practicing traditional customs, a process now known as the Stolen Generations. The second half of the 20th century to the present has seen a gradual shift towards improved human rights for Aboriginal people. In a 1967 referendum, more than 90% of the Australian population voted to end constitutional discrimination and to include Aborigines in the national census. During this period, many Aboriginal activists began to embrace the term "black" and use their ancestry as a source of pride. Activist Bob Maza said: In 1978, Aboriginal writer Kevin Gilbert received the National Book Council award for his book "Living Black: Blacks Talk to Kevin Gilbert", a collection of Aboriginal people's stories, and in 1998 was awarded (but refused to accept) the Human Rights Award for Literature for "Inside Black Australia", a poetry anthology and exhibition of Aboriginal photography. In contrast to previous definitions based solely on the degree of Aboriginal ancestry, the Government changed the legal definition of Aboriginal in 1990 to include any: This nationwide acceptance and recognition of Aboriginal people led to a significant increase in the number of people self-identifying as Aboriginal or Torres Strait Islander. The reappropriation of the term "black" with a positive and more inclusive meaning has resulted in its widespread use in mainstream Australian culture, including public media outlets, government agencies, and private companies. In 2012, a number of high-profile cases highlighted the legal and community attitude that identifying as Aboriginal or Torres Strait Islander is not dependent on skin color, with well-known boxer Anthony Mundine widely criticized for questioning the "blackness" of another boxer, and journalist Andrew Bolt successfully sued for publishing discriminatory comments about Aboriginals with light skin. Melanesians. The region of Melanesia is named from the Greek "melas" (μέλας), "black", and "nesos" (νῆσος), "island", etymologically meaning "islands of black [people]", in reference to the dark skin of the indigenous peoples. Early European settlers, such as Spanish explorer Yñigo Ortiz de Retez, noted the resemblance of the people to those in Africa. Melanesians, along with other Pacific Islanders, were frequently deceived or coerced during the 19th and 20th centuries into forced labour for sugarcane, cotton, and coffee planters in countries distant from their native lands, in a practice known as blackbirding. In Queensland, some 55,000 to 62,500 were brought from the New Hebrides, the Solomon Islands, and New Guinea to work in sugarcane fields. Under the Pacific Island Labourers Act 1901, most islanders working in Queensland were repatriated to their homelands. Those who remained in Australia, commonly called South Sea Islanders, often faced discrimination from white-dominated society similar to that faced by Indigenous Australians. Many indigenous rights activists have South Sea Islander ancestry, including Faith Bandler, Evelyn Scott and Bonita Mabo. Many Melanesians have taken up the term "Melanesia" as a way to empower themselves as a collective people. Stephanie Lawson writes that the term "moved from a term of denigration to one of affirmation, providing a positive basis for contemporary subregional identity as well as a formal organisation".
For instance, the term is used in the Melanesian Spearhead Group, which seeks to promote economic growth among Melanesian countries. Other. John Caesar, nicknamed "Black Caesar", a convict and bushranger with parents born in an unknown area in Africa, was one of the first people of recent black African ancestry to arrive in Australia. At the 2006 Census, 248,605 residents declared that they were born in Africa. This figure pertains to all immigrants to Australia who were born in nations in Africa regardless of race, and includes white Africans. North America. Canada. "Black Canadians" is a designation used for people of black African ancestry who are citizens or permanent residents of Canada. The majority of black Canadians are of Caribbean origin, though the population also consists of African American immigrants and their descendants (including black Nova Scotians), as well as many African immigrants. Black Canadians often draw a distinction between those of Afro-Caribbean ancestry and those of other African roots. The term "African Canadian" is occasionally used by some black Canadians who trace their heritage to the first slaves brought by British and French colonists to the North American mainland. Promised freedom by the British during the American Revolutionary War, thousands of Black Loyalists, such as Thomas Peters, were resettled by the Crown in Canada afterward. In addition, an estimated ten to thirty thousand fugitive slaves reached freedom in Canada from the Southern United States during the Antebellum years, aided by people along the Underground Railroad. Many black people of Caribbean origin in Canada reject the term "African Canadian" as an elision of the uniquely Caribbean aspects of their heritage, and instead identify as "Caribbean Canadian". Unlike in the United States, where "African American" has become a widely used term, in Canada controversies associated with distinguishing African or Caribbean heritage have resulted in the term "black Canadian" being widely accepted. United States. There were eight principal areas used by Europeans to buy and ship slaves to the Western Hemisphere. The number of enslaved people sold to the New World varied throughout the slave trade, and certain regions produced far more enslaved people than others. Between 1650 and 1900, 10.24 million enslaved West Africans arrived in the Americas from the following regions in the following proportions: By the early 1900s, "nigger" had become a pejorative word in the United States. In its stead, the term "colored" became the mainstream alternative to "negro" and its derived terms. After the American Civil Rights Movement, the terms "colored" and "negro" gave way to "black". "Negro" had superseded "colored" as the most polite word for African Americans at a time when "black" was considered more offensive. This term was accepted as normal, including by people classified as Negroes, until the later Civil Rights movement in the late 1960s. One well-known example is the Rev. Dr. Martin Luther King Jr.'s use of "Negro" in his famous 1963 speech, "I Have a Dream". During the American civil rights movement of the 1950s and 1960s, some African-American leaders in the United States, notably Malcolm X, objected to the word "Negro" because they associated it with the long history of slavery, segregation, and discrimination that treated African Americans as second-class citizens, or worse.
Malcolm X preferred "Black" to "Negro", but later gradually abandoned that as well for "Afro-American" after leaving the Nation of Islam. Since the late 1960s, various other terms for African Americans have been more widespread in popular usage. Aside from "black American", these include "Afro-American" (in use from the late 1960s to 1990) and "African American" (used in the United States to refer to Black Americans, people often referred to in the past as "American Negroes"). In the first 200 years that black people were in the United States, they primarily identified themselves by their specific ethnic group (closely allied to language) and not by skin color. Individuals identified themselves, for example, as Ashanti, Igbo, Bakongo, or Wolof. However, when the first captives were brought to the Americas, they were often combined with other groups from West Africa, and individual ethnic affiliations were not generally acknowledged by English colonists. In areas of the Upper South, different ethnic groups were brought together. This is significant as the captives came from a vast geographic region: the West African coastline stretching from Senegal to Angola and, in some cases, from the south-east coast, such as Mozambique. A new "African-American" identity and culture was born that incorporated elements of the various ethnic groups and of European cultural heritage, resulting in fusions such as the Black church and African-American English. This new identity was based on provenance and slave status rather than membership in any one ethnic group. By contrast, slave records from Louisiana show that the French and Spanish colonists recorded more complete identities of the West Africans, including ethnicities and given tribal names. The U.S. racial or ethnic classification "black" refers to people with all possible kinds of skin pigmentation, from the darkest through to the very lightest skin colors, including albinos, if they are believed by others to have African ancestry (in any discernible percentage). There are also certain cultural traits associated with being "African American", a term used effectively as a synonym for "black person" within the United States. In March 1807, Great Britain, which largely controlled the Atlantic, declared the transatlantic slave trade illegal, as did the United States. (The latter prohibition took effect 1 January 1808, the earliest date on which Congress had the power to do so after protecting the slave trade under the United States Constitution.) By that time, the majority of black people in the United States were native-born, so the use of the term "African" became problematic. Though it was initially a source of pride, many blacks feared that the use of "African" as an identity would be a hindrance to their fight for full citizenship in the United States. They also felt that it would give ammunition to those who were advocating the repatriation of black people to Africa. In 1835, black leaders called upon Black Americans to remove the title of "African" from their institutions and replace it with "Negro" or "Colored American". A few institutions chose to keep their historic names, such as the African Methodist Episcopal Church. African Americans popularly used the terms "Negro" or "colored" for themselves until the late 1960s. The term "black" was used throughout this period but not frequently, since it carried a certain stigma. In his 1963 "I Have a Dream" speech, Martin Luther King Jr. uses the term "negro" fifteen times and "black" four times.
Each time that he uses "black", it is in parallel construction with "white"; for example, "black men and white men". With the successes of the American Civil Rights Movement, a new term was needed to break from the past and help shed the reminders of legalized discrimination. In place of "Negro", activists promoted the use of "black" as standing for racial pride, militancy, and power. Some of the turning points included the use of the term "Black Power" by Kwame Ture (Stokely Carmichael) and the popular singer James Brown's song "Say It Loud – I'm Black and I'm Proud". In 1988, the civil rights leader Jesse Jackson urged Americans to use instead the term "African American" because it had a historical cultural base and was a construction similar to terms used by European descendants, such as German American, Italian American, etc. Since then, African American and black have often had parallel status. However, controversy continues over which, if any, of the two terms is more appropriate. Maulana Karenga argues that the term African-American is more appropriate because it accurately articulates their geographical and historical origin. Others have argued that "black" is a better term because "African" suggests foreignness, although black Americans helped found the United States. Still others believe that the term "black" is inaccurate because African Americans have a variety of skin tones. Some surveys suggest that the majority of Black Americans have no preference for "African American" or "black", although they have a slight preference for "black" in personal settings and "African American" in more formal settings. In the U.S. census race definitions, black and African Americans are citizens and residents of the United States with origins in the black racial groups of Africa. According to the Office of Management and Budget, the grouping includes individuals who self-identify as African American, as well as persons who emigrated from nations in the Caribbean and sub-Saharan Africa. The grouping is thus based on geography, and may contradict or misrepresent an individual's self-identification, since not all immigrants from sub-Saharan Africa are "black". The Census Bureau also notes that these classifications are socio-political constructs and should not be interpreted as scientific or anthropological. According to U.S. Census Bureau data, African immigrants generally do not self-identify as African American. The overwhelming majority of African immigrants identify instead with their own respective ethnicities (~95%). Immigrants from some Caribbean, Central American and South American nations and their descendants may or may not also self-identify with the term. Recent surveys of African Americans using a genetic testing service have found varied ancestries that show different tendencies by region and sex of ancestors. These studies found that on average, African Americans have 73.2–80.9% West African, 18–24% European, and 0.8–0.9% Native American genetic heritage, with large variation between individuals. According to studies in the Journal of Personality and Social Psychology, U.S. residents consistently overestimate the size, physical strength, and formidability of young black men. New Great Migration. The New Great Migration is not evenly distributed throughout the South. 
As with the earlier Great Migration, the New Great Migration is primarily directed toward cities and large urban areas, such as Atlanta, Charlotte, Houston, Dallas, Raleigh, Washington, D.C., Tampa, Virginia Beach, San Antonio, Memphis, Orlando, Nashville, and Jacksonville. North Carolina's Charlotte metro area, in particular, is a hot spot for African American migrants in the US. Between 1975 and 1980, Charlotte saw a net gain of 2,725 African Americans in the area. This number continued to rise: between 1985 and 1990 the area had a net gain of 7,497 African Americans, and from 1995 to 2000 the net gain was 23,313. This rise in net gain points to Atlanta, Charlotte, Dallas, and Houston as growing hot spots for migrants of the New Great Migration. The percentage of Black Americans who live in the South has been increasing since 1990, and the biggest gains have been in the region's large urban areas, according to census data. The Black population of metro Atlanta more than doubled between 1990 and 2020, surpassing 2 million in the most recent census. The Black population also more than doubled in metro Charlotte, while Greater Houston and Dallas-Fort Worth both saw their Black populations surpass 1 million for the first time. Several smaller metro areas also saw sizable gains, including San Antonio; Raleigh and Greensboro, N.C.; and Orlando. Primary destinations are states that have the most job opportunities, especially Georgia, North Carolina, Maryland, Virginia, Tennessee, Florida and Texas. Other southern states, including Mississippi, Louisiana, South Carolina, Alabama and Arkansas, have seen little net growth in the African American population from return migration. One-drop rule. From the late 19th century, the South used a colloquial term, the "one-drop rule", to classify as black a person of any known African ancestry. This practice of hypodescent was not put into law until the early 20th century. Legally, the definition varied from state to state. Racial definition was more flexible in the 18th and 19th centuries before the American Civil War. For instance, President Thomas Jefferson held in slavery persons who were legally white (less than 25% black) according to Virginia law at the time, but, because they were born to slave mothers, they were born into slavery, according to the principle of "partus sequitur ventrem", which Virginia adopted into law in 1662. Outside of the United States, some other countries have adopted the one-drop rule, but the definition of who is black and the extent to which the one-drop "rule" applies varies greatly from country to country. The one-drop rule may have originated as a means of increasing the number of black slaves and was maintained as an attempt to keep the white race "pure". One of the results of the one-drop rule was the uniting of the African-American community. Some of the most prominent abolitionists and civil-rights activists of the 19th century were multiracial, such as Frederick Douglass, Robert Purvis and James Mercer Langston. They advocated equality for all.
Blackness can be contrasted with "acting white", where black Americans are said to behave with assumed characteristics of stereotypical white Americans with regard to fashion, dialect, taste in music, and possibly, from the perspective of a significant number of black youth, academic achievement. Due to the often political and cultural contours of blackness in the United States, the notion of blackness can also be extended to non-black people. Toni Morrison once described Bill Clinton as the first black President of the United States, because, as she put it, he displayed "almost every trope of blackness". Clinton welcomed the label. The question of blackness also arose in Democrat Barack Obama's 2008 presidential campaign. Commentators questioned whether Obama, who was elected the first president with black ancestry, was "black enough", contending that his background was not typical because his mother was a white American and his father was a black student visitor from Kenya. Obama chose to identify as black and African American. Mexico. The 2015 preliminary survey to the 2020 census allowed Afro-Mexicans to self-identify for the first time in Mexico and recorded a total of 1.4 million (1.2% of the total Mexican population). The majority of Afro-Mexicans live in the Costa Chica of Guerrero region. Caribbean. Dominican Republic. The first Afro-Dominican slaves were shipped to the Dominican Republic by Spanish conquistadors during the Transatlantic slave trade. Puerto Rico. Spanish conquistadors shipped slaves from West Africa to Puerto Rico. Afro-Puerto Ricans in part trace their ancestry to this colonization of the island. South America. Approximately 12 million people were shipped from Africa to the Americas during the Atlantic slave trade from 1492 to 1888. Of these, 11.5 million were shipped to South America and the Caribbean. Brazil was the largest importer in the Americas, with 5.5 million African slaves imported, followed by the British Caribbean with 2.76 million, the Spanish Caribbean and Spanish Mainland with 1.59 million, and the French Caribbean with 1.32 million. Today their descendants number approximately 150 million in South America and the Caribbean. In addition to skin color, other physical characteristics such as facial features and hair texture are often variously used in classifying peoples as black in South America and the Caribbean. In South America and the Caribbean, classification as black is also closely tied to social status and socioeconomic variables, especially in light of social conceptions of "blanqueamiento" (racial whitening) and related concepts. Brazil. The concept of race in Brazil is complex. A Brazilian child was never automatically identified with the racial type of one or both of their parents, nor were there only two categories to choose from. Between an individual of unmixed West African ancestry and a very light mulatto individual, more than a dozen racial categories were acknowledged, based on various combinations of hair color, hair texture, eye color, and skin color. These types grade into each other like the colors of the spectrum, and no one category stands significantly isolated from the rest. In Brazil, people are classified by appearance, not heredity. Scholars disagree over the effects of social status on racial classifications in Brazil. It is generally believed that upward mobility and education result in individuals being classified as a category of lighter skin. 
The popular claim is that in Brazil, poor whites are considered black and wealthy blacks are considered white. Some scholars disagree, arguing that "whitening" of one's social status may be open to people of mixed race, a large part of the population known as "pardo", but a person perceived as "preto" (black) will continue to be classified as black regardless of wealth or social status. Statistics. From 1500 to 1850, an estimated 3.5 million captives were forcibly shipped from West/Central Africa to Brazil. The territory received the highest number of enslaved Africans of any country in the Americas. Scholars estimate that more than half of the Brazilian population is at least in part descended from these individuals. Brazil has the largest population of African ancestry outside Africa. In contrast to the US, during the slavery period and afterward, the Portuguese colonial government in Brazil and the later Brazilian government did not pass formal anti-miscegenation or segregation laws. As in other Latin American countries, intermarriage was prevalent during the colonial period and continued afterward. In addition, people of mixed race ("pardo") often tended to marry white spouses, and their descendants became accepted as white. As a result, some of the European-descended population also has West African or Amerindian blood. According to the last census of the 20th century, in which Brazilians could choose from five color/ethnic categories with which they identified, 54% of individuals identified as white, 6.2% identified as black, and 39.5% identified as pardo (brown), a broad multi-racial category including tri-racial persons. In the 19th century, a philosophy of racial whitening emerged in Brazil, related to the assimilation of mixed-race people into the white population through intermarriage. Until recently the government did not keep data on race. However, statisticians estimate that in 1835, roughly 50% of the population was "preto" (black; most were enslaved), a further 20% was "pardo" (brown), and 25% white, with the remainder Amerindian. Some classified as pardo were tri-racial. By the 2000 census, demographic changes, including the end of slavery, immigration from Europe and Asia, assimilation of multiracial persons, and other factors, resulted in a population in which 6.2% identified as black, 40% as pardo, and 55% as white. Essentially most of the black population was absorbed into the multi-racial category by intermixing. A 2007 genetic study found that at least 29% of the middle-class, white Brazilian population had some recent (since 1822 and the end of the colonial period) African ancestry. Race relations in Brazil. According to the 2022 census, 10.2% of Brazilians said they were black, compared with 7.6% in 2010, and 45.3% said they were racially mixed, up from 43.1%, while the proportion of self-declared white Brazilians fell from 47.7% to 43.5%. Activists from Brazil's Black movement attribute the racial shift in the population to a growing sense of pride among African-descended Brazilians in recognising and celebrating their ancestry. The philosophy of racial democracy in Brazil has drawn some criticism, based on economic issues. Brazil has one of the largest gaps in income distribution in the world. The richest 10% of the population earn 28 times the average income of the bottom 40%. The richest 10 percent is almost exclusively white or predominantly European in ancestry. 
One-third of the population lives under the poverty line, with blacks and other people of color accounting for 70 percent of the poor. In the United States in 2015, African Americans, including multiracial people, earned 76.8% as much as white people. By contrast, black and mixed-race Brazilians earned on average 58% as much as whites in 2014. The gap in income between blacks and other non-whites is relatively small compared to the large gap between whites and all people of color. Other social factors, such as illiteracy and education levels, show the same patterns of disadvantage for people of color. Some commentators observe that the practice of segregation and white supremacy in the American South, and discrimination in many areas outside that region, forced many African Americans to unite in the civil rights struggle, whereas the fluid nature of race in Brazil has divided individuals of African ancestry according to how much of that ancestry they have, and has helped sustain an image of the country as an example of post-colonial harmony. This has hindered the development of a common identity among black Brazilians. Though Brazilians of at least partial African heritage make up a large percentage of the population, few blacks have been elected as politicians. The city of Salvador, Bahia, for instance, is 80% people of color, but voters have not elected a mayor of color. Patterns of discrimination against non-whites have led some academic and other activists to advocate for use of the Portuguese term "negro" to encompass all African-descended people, in order to stimulate a "black" consciousness and identity. Colombia. Afro-Colombians are the third-largest African diaspora population in Latin America, after Afro-Brazilians and Afro-Haitians. Venezuela. Most black Venezuelans descend from people brought as slaves to Venezuela directly from Africa during colonization; others are descendants of immigrants from the Antilles and Colombia. Many blacks were part of the independence movement, and several became heroes of it. There is a deep-rooted African heritage in Venezuelan culture, as demonstrated in many traditional Venezuelan musical genres and dances, such as the Tambor, a genre inherited from black members of the colony, or the Llanera music and the Gaita zuliana, both fusions of the three major peoples that contributed to the country's cultural heritage. African heritage is also present in the country's gastronomy. There are entire communities of blacks in the Barlovento zone, as well as in part of Bolívar state and in other small towns; they also live peaceably among the general population in the rest of Venezuela. Currently, blacks represent a plurality of the Venezuelan population, although many are actually of mixed heritage.
4746
7903804
https://en.wikipedia.org/wiki?curid=4746
Plague (disease)
Plague is an infectious disease caused by the bacterium "Yersinia pestis". Symptoms include fever, weakness and headache, usually beginning one to seven days after exposure. There are three forms of plague, each affecting a different part of the body and causing associated symptoms. Pneumonic plague infects the lungs, causing shortness of breath, coughing and chest pain; bubonic plague affects the lymph nodes, making them swell; and septicemic plague infects the blood and can cause tissues to turn black and die. The bubonic and septicemic forms are generally spread by flea bites or handling an infected animal, whereas pneumonic plague is generally spread between people through the air via infectious droplets. Diagnosis is typically by finding the bacterium in fluid from a lymph node, blood or sputum. Vaccination is recommended only for people at high risk of exposure to plague; it is not advised for the general population. Those exposed to a case of pneumonic plague may be treated with preventive medication. If infected, treatment is with antibiotics and supportive care; typical regimens include a combination of gentamicin and a fluoroquinolone. The risk of death with treatment is about 10%, while without treatment it is about 70%. Globally, about 600 cases are reported a year. In 2017, the countries with the most cases included the Democratic Republic of the Congo, Madagascar and Peru. In the United States, infections occasionally occur in rural areas, where the bacteria are believed to circulate among rodents. Plague has historically occurred in large outbreaks, the best known being the Black Death in the 14th century, which resulted in more than 50 million deaths in Europe. Signs and symptoms. There are several different clinical manifestations of plague. The most common form is bubonic plague, followed by septicemic and pneumonic plague. Other clinical manifestations include plague meningitis, plague pharyngitis, and ocular plague. General symptoms of plague include fever, chills, headaches, and nausea. Many people experience swelling in their lymph nodes if they have bubonic plague. For those with pneumonic plague, symptoms may (or may not) include a cough, pain in the chest, and haemoptysis. Bubonic plague. When a flea bites a human and contaminates the wound with regurgitated blood, the plague-causing bacteria are passed into the tissue. "Y. pestis" can reproduce inside cells, so even if phagocytosed, the bacteria can still survive. Once in the body, they can enter the lymphatic system, which drains interstitial fluid. Plague bacteria secrete several toxins, one of which is known to cause beta-adrenergic blockade. "Y. pestis" spreads through the lymphatic vessels of the infected human until it reaches a lymph node, where it causes acute lymphadenitis. The swollen lymph nodes form the characteristic buboes associated with the disease, and autopsies of these buboes have revealed them to be mostly hemorrhagic or necrotic. If the lymph node is overwhelmed, the infection can pass into the bloodstream, causing "secondary septicemic plague", and if the lungs are seeded, it can cause "secondary pneumonic plague". Septicemic plague. Lymphatics ultimately drain into the bloodstream, so the plague bacteria may enter the blood and travel to almost any part of the body. 
In septicemic plague, bacterial endotoxins cause disseminated intravascular coagulation (DIC), producing tiny clots throughout the body and possibly ischemic necrosis (tissue death due to lack of circulation or perfusion) from the clots. DIC depletes the body's clotting resources, so that it can no longer control bleeding. Consequently, there is bleeding into the skin and other organs, which can cause a red and/or black patchy rash and hemoptysis or hematemesis (coughing up or vomiting of blood). There are bumps on the skin that look somewhat like insect bites; these are usually red, and sometimes white in the centre. Untreated, septicemic plague is usually fatal. Early treatment with antibiotics reduces the mortality rate to between 4 and 15 per cent. Pneumonic plague. The pneumonic form of plague arises from infection of the lungs. It causes coughing and thereby produces airborne droplets that contain bacterial cells and are likely to infect anyone inhaling them. The incubation period for pneumonic plague is short, usually two to four days, but sometimes just a few hours. The initial signs are indistinguishable from several other respiratory illnesses; they include headache, weakness, and spitting or vomiting of blood. The course of the disease is rapid; unless diagnosed and treated soon enough, typically within a few hours, death may follow in one to six days; in untreated cases, mortality is nearly 100%. Cause. Transmission of "Y. pestis" to an uninfected individual is possible by several routes, including flea bites, direct contact with infected animals or tissue, and inhalation of infectious airborne droplets. "Yersinia pestis" circulates in animal reservoirs, particularly in rodents, in the natural foci of infection found on all continents except Australia. The natural foci of plague are situated in a broad belt in the tropical and sub-tropical latitudes and the warmer parts of the temperate latitudes around the globe, between the parallels 55° N and 40° S. Contrary to popular belief, rats did not directly start the spread of the bubonic plague. It is mainly a disease of the fleas ("Xenopsylla cheopis") that infested the rats, making the rats themselves the first victims of the plague. Rodent-borne infection in a human occurs when a person is bitten by a flea that has been infected by biting a rodent that itself was infected by the bite of a flea carrying the disease. The bacteria multiply inside the flea, sticking together to form a plug that blocks its stomach and causes it to starve. The flea then bites a host and continues to feed, even though it cannot quell its hunger, and consequently it vomits blood tainted with the bacteria back into the bite wound. The bubonic plague bacterium then infects a new person, and the flea eventually dies from starvation. Serious outbreaks of plague are usually started by other disease outbreaks in rodents or by a rise in the rodent population. A 21st-century study of a 1665 outbreak of plague in the village of Eyam in England's Derbyshire Dales, which isolated itself during the outbreak, facilitating modern study, found that three-quarters of cases are likely to have been due to human-to-human transmission, especially within families, a much larger proportion than previously thought. Diagnosis. Symptoms of plague are usually non-specific, and laboratory testing is required for a definitive diagnosis. "Y. pestis" can be identified both under a microscope and by culturing a sample, and this is used as the reference standard to confirm that a person has a case of plague. 
The sample can be obtained from the blood, mucus (sputum), or aspirate extracted from inflamed lymph nodes (buboes). If a person is administered antibiotics before a sample is taken, if there is a delay in transporting the sample to a laboratory, or if the sample is poorly stored, there is a possibility of false negative results. Polymerase chain reaction (PCR) may also be used to diagnose plague, by detecting the presence of bacterial genes such as the "pla" gene (plasminogen activator) and the "caf1" gene (F1 capsule antigen). PCR testing requires a very small sample and is effective for both live and dead bacteria. For this reason, if a person receives antibiotics before a sample is collected for laboratory testing, they may have a false negative culture but a positive PCR result. Blood tests to detect antibodies against "Y. pestis" can also be used to diagnose plague; however, this requires taking blood samples at different periods to detect differences between the acute and convalescent phases of F1 antibody titres. In 2020, a study was released on rapid diagnostic tests that detect the F1 capsule antigen (F1RDT) by sampling sputum or bubo aspirate. Results show that the F1RDT can be used for people who have suspected pneumonic or bubonic plague, but cannot be used in asymptomatic people. F1RDT may be useful in providing a fast result for prompt treatment and a fast public health response, as studies suggest that it is highly sensitive for both pneumonic and bubonic plague. However, when using the rapid test, both positive and negative results need to be confirmed to establish or reject the diagnosis of a confirmed case of plague, and the test result needs to be interpreted within the epidemiological context: study findings indicate that although 40 out of 40 people who had the plague in a population of 1000 were correctly diagnosed, 317 people were falsely diagnosed as positive.
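The practical meaning of those study figures is easier to see with the arithmetic written out. The short sketch below is illustrative only; it simply restates the numbers quoted above as standard diagnostic measures, and shows why a positive rapid-test result still needs laboratory confirmation: with these figures, only about one positive result in nine corresponds to an actual case.

```python
# Worked example: the diagnostic arithmetic behind the F1RDT figures quoted
# above (40 true positives out of 40 cases, 317 false positives, population 1000).
population = 1000
cases = 40                     # people who actually had plague
true_pos = 40                  # all actual cases were detected
false_pos = 317                # people without plague flagged as positive

non_cases = population - cases             # 960 people without plague
true_neg = non_cases - false_pos           # 643 correctly ruled out

sensitivity = true_pos / cases             # 40/40   = 100%
specificity = true_neg / non_cases         # 643/960 = ~67%
ppv = true_pos / (true_pos + false_pos)    # 40/357  = ~11.2%

print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, "
      f"positive predictive value={ppv:.1%}")
```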
Prevention. Vaccination. Bacteriologist Waldemar Haffkine developed the first plague vaccine in 1897. He conducted a massive inoculation program in British India, and it is estimated that 26 million doses of Haffkine's anti-plague vaccine were sent out from Bombay between 1897 and 1925, reducing plague mortality by 50–85%. Since human plague is rare in most parts of the world as of 2023, routine vaccination is not needed except for those at particularly high risk of exposure; it is not needed even for people living in areas with enzootic plague, meaning plague that occurs at regular, predictable rates in specific populations and areas, such as the western United States. It is not even indicated for most travellers to countries with known recent reported cases, particularly if their travel is limited to urban areas with modern hotels. The United States CDC thus only recommends vaccination for (1) all laboratory and field personnel who are working with "Y. pestis" organisms resistant to antimicrobials; (2) people engaged in aerosol experiments with "Y. pestis"; and (3) people engaged in field operations in areas with enzootic plague where preventing exposure is not possible (such as some disaster areas). A systematic review by the Cochrane Collaboration found no studies of sufficient quality to make any statement on the efficacy of the vaccine. Early diagnosis. Diagnosing plague early leads to a decrease in transmission or spread of the disease. Prophylaxis. Pre-exposure prophylaxis for first responders and health care providers who will care for patients with pneumonic plague is not considered necessary as long as standard and droplet precautions can be maintained. In cases of surgical mask shortages, patient overcrowding, poor ventilation in hospital wards, or other crises, pre-exposure prophylaxis might be warranted if sufficient supplies of antimicrobials are available. Postexposure prophylaxis should be considered for people who had close (<6 feet), sustained contact with a patient with pneumonic plague and were not wearing adequate personal protective equipment. Antimicrobial postexposure prophylaxis can also be considered for laboratory workers accidentally exposed to infectious materials and for people who had close (<6 feet) or direct contact with infected animals, such as veterinary staff, pet owners, and hunters. Specific recommendations on pre- and post-exposure prophylaxis are available in the clinical guidelines on treatment and prophylaxis of plague published in 2021. Treatments. If diagnosed in time, the various forms of plague are usually highly responsive to antibiotic therapy. The antibiotics often used are streptomycin, chloramphenicol and tetracycline. Amongst the newer generation of antibiotics, gentamicin and doxycycline have proven effective in monotherapeutic treatment of plague. Guidelines on treatment and prophylaxis of plague were published by the Centers for Disease Control and Prevention in 2021. The plague bacterium could develop drug resistance and again become a major health threat. One case of a drug-resistant form of the bacterium was found in Madagascar in 1995. Further outbreaks in Madagascar were reported in November 2014 and October 2017. Epidemiology. Globally, about 600 cases are reported a year. In 2017, the countries with the most cases included the Democratic Republic of the Congo, Madagascar and Peru. Plague has historically occurred in large outbreaks, the best known being the Black Death in the 14th century, which resulted in more than 50 million dead. In recent years, cases have been distributed between small seasonal outbreaks, which occur primarily in Madagascar, and sporadic outbreaks or isolated cases in endemic areas. In 2022, the possible origin of all modern strains of "Yersinia pestis" was found in DNA from human remains in three graves in Kyrgyzstan, dated to 1338 and 1339. The 1346 siege of Caffa in Crimea is known to have been the first plague outbreak involving the strains that later spread over Europe. Comparing the sequenced DNA with other ancient and modern strains paints a family tree of the bacterium. Bacteria affecting marmots in Kyrgyzstan today are closest to the strain found in the graves, suggesting that this is also the location where plague transferred from animals to humans. Biological weapon. The plague has a long history as a biological weapon. Historical accounts from ancient China and medieval Europe detail the use of infected animal carcasses, such as cows or horses, and human carcasses, by the Xiongnu/Huns, Mongols, Turks and other groups, to contaminate enemy water supplies. Han dynasty general Huo Qubing is recorded to have died of such contamination while engaging in warfare against the Xiongnu. Plague victims were also reported to have been tossed by catapult into cities under siege. In 1347, the Genoese possession of Caffa, a great trade emporium on the Crimean peninsula, came under siege by an army of Mongol warriors of the Golden Horde under the command of Jani Beg. 
After a protracted siege, during which the Mongol army was reportedly withering from the disease, they decided to use the infected corpses as a biological weapon. The corpses were catapulted over the city walls, infecting the inhabitants. This event might have led to the transfer of the Black Death via their ships into the south of Europe, possibly explaining its rapid spread. During World War II, the Japanese Army developed weaponized plague, based on the breeding and release of large numbers of fleas. During the Japanese occupation of Manchuria, Unit 731 deliberately infected Chinese, Korean and Manchurian civilians and prisoners of war with the plague bacterium. These subjects, termed "maruta" or "logs", were then studied, some by dissection, others by vivisection while still conscious. Members of the unit such as Shirō Ishii were exonerated from prosecution at the Tokyo tribunal by Douglas MacArthur, but 12 of them were prosecuted in the Khabarovsk War Crime Trials in 1949, during which some admitted having spread bubonic plague within the area around the city of Changde. Ishii developed bombs containing live mice and fleas, with very small explosive loads, to deliver the weaponized microbes, overcoming the problem of the explosive killing the infected animals and insects by using a ceramic, rather than metal, casing for the warhead. While no records survive of the actual usage of the ceramic shells, prototypes exist and are believed to have been used in experiments during WWII. After World War II, both the United States and the Soviet Union developed means of weaponising pneumonic plague. Experiments included various delivery methods, vacuum drying, sizing the bacterium, developing strains resistant to antibiotics, combining the bacterium with other diseases (such as diphtheria), and genetic engineering. Scientists who worked in USSR bio-weapons programs have stated that the Soviet effort was formidable and that large stocks of weaponised plague bacteria were produced. Information on many of the Soviet and US projects is largely unavailable. Aerosolized pneumonic plague remains the most significant threat. The plague can be easily treated with antibiotics, and some countries, such as the United States, have large supplies on hand should such an attack occur, making the threat less severe.
4748
49931321
https://en.wikipedia.org/wiki?curid=4748
Baudot code
The Baudot code is an early character encoding for telegraphy invented by Émile Baudot in the 1870s. It was the predecessor to the International Telegraph Alphabet No. 2 (ITA2), the most common teleprinter code in use before ASCII. Each character in the alphabet is represented by a series of five bits, sent over a communication channel such as a telegraph wire or a radio signal by asynchronous serial communication. The unit of symbol rate, the baud, is named after Baudot. History. Baudot code (ITA1). In the original code table, columns I, II, III, IV, and V show the code, and the Let. and Fig. columns show the letters and figures for the Continental and UK versions. Baudot developed his first multiplexed telegraph in 1872 and patented it in 1874. In 1876, he changed from a six-bit code to a five-bit code, as suggested by Carl Friedrich Gauss and Wilhelm Weber in 1834, with equal on and off intervals, which allowed for transmission of the Roman alphabet and included punctuation and control signals. The code itself was not patented (only the machine), because French patent law does not allow concepts to be patented. Baudot's 5-bit code was adapted to be sent from a manual keyboard, and no teleprinter equipment was ever constructed that used it in its original form. The code was entered on a keyboard which had just five piano-type keys and was operated using two fingers of the left hand and three fingers of the right hand. Once the keys had been pressed, they were locked down until mechanical contacts in a distributor unit passed over the sector connected to that particular keyboard, at which time the keyboard was unlocked, ready for the next character to be entered, with an audible click (known as the "cadence signal") to warn the operator. Operators had to maintain a steady rhythm, and the usual speed of operation was 30 words per minute. The table "shows the allocation of the Baudot code which was employed in the British Post Office for continental and inland services. A number of characters in the continental code are replaced by fractionals in the inland code. Code elements 1, 2 and 3 are transmitted by keys 1, 2 and 3, and these are operated by the first three fingers of the right hand. Code elements 4 and 5 are transmitted by keys 4 and 5, and these are operated by the first two fingers of the left hand." Baudot's code became known as the International Telegraph Alphabet No. 1 (ITA1). It is no longer used. Murray code. In 1901, Baudot's code was modified by Donald Murray (1865–1945), prompted by his development of a typewriter-like keyboard. The Murray system employed an intermediate step: an operator used a keyboard perforator to punch a paper tape and then a transmitter to send the message from the punched tape. At the receiving end of the line, a printing mechanism would print on a paper tape, and/or a reperforator would make a perforated copy of the message. Because there was no longer a direct connection between the operator's hand movement and the bits transmitted, there was no concern about arranging the code to minimize operator fatigue. Instead, Murray designed the code to minimize wear on the machinery by assigning the code combinations with the fewest punched holes to the most frequently used characters. For example, the one-hole letters are E and T. The ten two-hole letters are AOINSHRDLZ, very similar to the "Etaoin shrdlu" order used in Linotype machines. 
Ten more letters, BCGFJMPUWY, have three holes each, and the four-hole letters are VXKQ. The Murray code also introduced what became known as "format effectors" or "control characters": the CR (Carriage Return) and LF (Line Feed) codes. A few of Baudot's codes moved to the positions where they have stayed ever since: the NULL or BLANK and the DEL code. NULL/BLANK was used as an idle code for when no messages were being sent, but the same code was used to encode the space separating words. Sequences of DEL codes (fully punched columns) were used at the start or end of messages, or between them, which made it easier to separate distinct messages. (BELL codes could be inserted in those sequences to signal to the remote operator that a new message was coming or that transmission of a message was terminated.) Early British Creed machines also used the Murray system. Western Union. Murray's code was adopted by Western Union, which used it until the 1950s, with a few changes that consisted of omitting some characters and adding more control codes. An explicit SPC (space) character was introduced in place of the BLANK/NULL, and a new BEL code rang a bell or otherwise produced an audible signal at the receiver. Additionally, the WRU or "Who aRe yoU?" code was introduced, which caused a receiving machine to send an identification stream back to the sender. ITA2. In 1932, the CCITT introduced the International Telegraph Alphabet No. 2 (ITA2) code as an international standard, which was based on the Western Union code with some minor changes. The US standardized on a version of ITA2 called the American Teletypewriter code (US TTY), which was the basis for 5-bit teletypewriter codes until the debut of 7-bit ASCII in 1963. Some code points (marked blue in the table) were reserved for national-specific usage. The code position assigned to Null was in fact used only for the idle state of teleprinters. During long periods of idle time, the impulse rate was not synchronized between both devices (which could even be powered off or not permanently interconnected on commuted phone lines). To start a message it was first necessary to calibrate the impulse rate, using a sequence of regularly timed "mark" pulses (1) in a group of five pulses, which could also be detected by simple passive electronic devices to turn on the teleprinter. This sequence of pulses generated a series of Erasure/Delete characters while also initializing the state of the receiver to the Letters shift mode. However, the first pulse could be lost, so this power-on procedure could then be terminated by a single Null immediately followed by an Erasure/Delete character. To preserve the synchronization between devices, the Null code could not be used arbitrarily in the middle of messages (this was an improvement over the initial Baudot system, in which spaces were not explicitly differentiated, so it was difficult to maintain the pulse counters for repeated spaces on teleprinters). But it was then possible to resynchronize devices at any time by sending a Null in the middle of a message (immediately followed by an Erasure/Delete/LS control if followed by a letter, or by a FS control if followed by a figure). Sending Null controls also did not cause the paper band to advance to the next row (as nothing was punched), so this saved precious lengths of punchable paper band. On the other hand, the Erasure/Delete/LS control code was always punched and always shifted to the (initial) letters mode. 
According to some sources, the Null code point was reserved for country-internal usage only. The Shift to Letters code (LS) is also usable as a way to cancel/delete text from a punched tape after it has been read, allowing the safe destruction of a message before discarding the punched band. Functionally, it can also play the same filler role as the Delete code in ASCII (or other 7-bit and 8-bit encodings, including EBCDIC for punched cards). After codes in a fragment of text have been replaced by an arbitrary number of LS codes, what follows is still preserved and decodable. It can also be used as an initiator to make sure that the decoding of the first code will not give a digit or another symbol from the figures page (because the Null code can be arbitrarily inserted near the end or beginning of a punch band, and has to be ignored, whereas the Space code is significant in text). The cells marked as reserved for extensions (which use the LS code a second time, just after the first LS code, to shift from the figures page to the letters shift page) have been defined to shift into a new mode. In this new mode, the letters page contains only lowercase letters, but retains access to a third code page for uppercase letters, either by encoding a single letter (by sending LS before that letter), or by locking (with FS+LS) for an unlimited number of capital letters or digits before unlocking (with a single LS) to return to lowercase mode. The cell marked as "Reserved" is also usable (using the FS code from the figures shift page) to switch the page of figures (which normally contains digits and "national" lowercase letters or symbols) to a fourth page (where national letters are uppercase and other symbols may be encoded). ITA2 is still used in telecommunications devices for the deaf (TDD), Telex, and some amateur radio applications, such as radioteletype ("RTTY"). ITA2 is also used in Enhanced Broadcast Solution, an early 21st-century financial protocol specified by Deutsche Börse, to reduce the character encoding footprint. Nomenclature. Nearly all 20th-century teleprinter equipment used Western Union's code, ITA2, or variants thereof. Radio amateurs casually, but incorrectly, call ITA2 and its variants "Baudot", and even the American Radio Relay League's Amateur Radio Handbook does so, though in more recent editions the tables of codes correctly identify it as ITA2. Character set. In the original character-set table, the values shown in each cell are the Unicode codepoints, given for comparison. Weather code. Meteorologists used a variant of ITA2 with the figures-case symbols, except for the ten digits, BEL and a few other characters, replaced by weather symbols. Details. Note: the code table presumes that the space called "1" by Baudot and Murray is rightmost and least significant. The way the transmitted bits were packed into larger codes varied by manufacturer. The most common solution allocates the bits from the least significant bit towards the most significant bit (leaving the three most significant bits of a byte unused). In ITA2, characters are expressed using five bits. ITA2 uses two code sub-sets, the "letter shift" (LTRS) and the "figure shift" (FIGS). The FIGS character (11011) signals that the following characters are to be interpreted as being in the FIGS set, until this is reset by the LTRS (11111) character. In use, the LTRS or FIGS shift key is pressed and released, transmitting the corresponding shift character to the other machine. The desired letters or figures characters are then typed. 
Unlike a typewriter or modern computer keyboard, the shift key is not kept depressed whilst the corresponding characters are typed. "ENQuiry" will trigger the other machine's answerback; it means "Who are you?" CR is carriage return, LF is line feed, BEL is the bell character which rang a small bell (often used to alert operators to an incoming message), SP is space, and NUL is the null character (blank tape). Note: the binary conversions of the codepoints are often shown in reverse order, depending on (presumably) from which side one views the paper tape. Note further that the "control" characters were chosen so that they were either symmetric or in useful pairs, so that inserting a tape "upside down" did not result in problems for the equipment, and the resulting printout could be deciphered. Thus FIGS (11011), LTRS (11111) and space (00100) are invariant, while CR (00010) and LF (01000), generally used as a pair, are treated the same regardless of order by page printers. LTRS could also be used to overpunch characters to be deleted on a paper tape (much like DEL in 7-bit ASCII). The sequence "RYRYRY..." is often used in test messages, and at the start of every transmission. Since R is 01010 and Y is 10101, the sequence exercises much of a teleprinter's mechanical components at maximum stress. Also, at one time, fine-tuning of the receiver was done using two coloured lights (one for each tone). "RYRYRY..." produced 0101010101..., which made the lights glow with equal brightness when the tuning was correct. This tuning sequence is only useful when ITA2 is used with two-tone FSK modulation, such as is commonly seen in radioteletype (RTTY) usage. US implementations of Baudot code may differ in the addition of a few characters, such as # and & on the FIGS layer. The Russian version of Baudot code (MTK-2) used three shift modes; the Cyrillic letter mode was activated by the code 00000. Because of the larger number of characters in the Cyrillic alphabet, the characters !, & and £ were omitted and replaced by Cyrillics, and BEL has the same code as the Cyrillic letter Ю. The Cyrillic letters Ъ and Ё are omitted, and Ч is merged with the numeral 4.
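The shift-state behaviour described above is straightforward to model in software. The sketch below is a minimal illustrative decoder, not a reference implementation: it includes only the code points actually quoted in this article (R, Y, space, CR, LF, NUL, FIGS and LTRS, written MSB-first as above, with element "1" least significant), and a full implementation would need the complete ITA2 letters and figures tables.

```python
# Minimal ITA2 decoder sketch. Only code points quoted in the article are
# included; the figures page is left empty, so figures decode as "?".
LTRS, FIGS = 0b11111, 0b11011            # lock into letters / figures page

LETTERS = {0b01010: "R", 0b10101: "Y"}   # partial letters page (from the text)
FIGURES = {}                             # full figures page omitted here
COMMON = {                               # codes identical on both pages
    0b00000: "",     # NUL: idle code, contributes nothing to the text
    0b00100: " ",    # space
    0b00010: "\r",   # CR
    0b01000: "\n",   # LF
}

def decode(codes):
    """Decode a stream of 5-bit ITA2 codes, tracking the shift state."""
    page, out = LETTERS, []              # receivers start in letters mode
    for code in codes:
        code &= 0b11111                  # five bits, packed into the low
                                         # bits of a byte, LSB-first
        if code == LTRS:
            page = LETTERS
        elif code == FIGS:
            page = FIGURES
        elif code in COMMON:
            out.append(COMMON[code])
        else:
            out.append(page.get(code, "?"))
    return "".join(out)

# The classic test pattern: LTRS followed by alternating R and Y.
print(decode([0b11111, 0b01010, 0b10101, 0b01010, 0b10101]))  # -> RYRY
```

Note how NUL passes through without affecting either the shift state or the output, mirroring its role as an idle and resynchronization code described earlier.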
4749
245306
https://en.wikipedia.org/wiki?curid=4749
Blu Tack
Blu Tack is a reusable putty-like pressure-sensitive adhesive produced by Bostik, commonly used to attach lightweight objects (such as posters or sheets of paper) to walls, doors or other dry surfaces. Traditionally blue, it is also available in other colours. Generic versions of the product are available from other manufacturers. The spelling now used is without a hyphen. As of 2015, Bostik was manufacturing around 100 tonnes of Blu Tack weekly at its Leicester factory. History. Prestik/"Blu Tack" was originally developed in 1969 in Leicester. While the inventor of the commercial Bostik product is unknown, a precursor was created around 1970 as an accidental by-product of an attempt to develop a sealant based on chalk powder, rubber and oil. Blu Tack was originally white, but following fears that children could mistake it for chewing gum, a blue colouring was added. In the United Kingdom in March 2008, 20,000 numbered packs of pink Blu Tack were made available to help raise money for Breast Cancer Campaign, with 10 pence from each pack going to the charity. The formulation was slightly altered to retain complete consistency with its blue counterpart. Since then, many coloured variations have been made, including red-and-white, yellow, and a green Halloween pack. Composition. Blu Tack is described as a synthetic rubber compound without hazardous properties under normal conditions. It can be swallowed without harm and is not carcinogenic. It is non-soluble and is denser than water. The material is not flammable, but emits carbon dioxide and carbon monoxide when exposed to fire or high temperatures. Similar products. Similar products of various colours are made by many manufacturers, including Faber-Castell's "Tack-it", Henkel's "Fun-Tak", UHU's "Poster Putty" and "Sticky Tack", UFO's "Dough Tack" and "Gummy Sticker", Pritt's "Sticky Stuff", Bostik's "Prestik" and Elmer's "Poster Tack". Plasti-Tak by Brooks Manufacturing Company appears to pre-date Blu Tack, with a trademark registration in 1964. Versions of the product are also sold under the generic names "adhesive putty" and "mounting putty". The generic trademark or common name for mounting putty varies by region. It is known as "Patafix" in France, Italy, Portugal, Austria and Turkey; in Iceland and Norway ("lærertyggis") by names meaning "teacher's chewing gum"; in Sweden by names meaning "attachment paste"; and in South Africa by an Afrikaans name literally translated as "wonder glue". Alternative uses. Like all poster putties, Blu Tack provides an alternative to the artist's traditional kneaded eraser. Blu Tack was often used with the Sinclair ZX81 microcomputer to help mitigate crashes caused by wobbly external RAM modules. This was such a widespread problem that Sinclair Research's technical support department officially recommended the use of Blu Tack to resolve the issue. In 2007, the artist Elizabeth Thompson created a sculpture of a house spider using Blu Tack over a wire frame; it was exhibited at London Zoo. Blu Tack can be used as a damping agent for sound and vibration applications, due to its low amplitude response properties. A 2013 study concluded that the substance is a comfortable alternative to over-the-counter ear plugs for the attenuation of everyday sound. 
The New Zealand Government Earthquake Commission recommends that products such as Blu Tack be used to prevent ornaments and small household items from falling or moving in the event of an earthquake. Blu Tack is sometimes used by electronics hobbyists to hold through-hole components in position for soldering onto PC boards.
4751
27823944
https://en.wikipedia.org/wiki?curid=4751
Bacillus
Bacillus, from Latin "bacillus", meaning "little staff, wand", is a genus of Gram-positive, rod-shaped bacteria, a member of the phylum "Bacillota", with 266 named species. The term is also used to describe the shape (rod) of other so-shaped bacteria, and the plural "Bacilli" is the name of the class of bacteria to which this genus belongs. "Bacillus" species can be either obligate aerobes, which are dependent on oxygen, or facultative anaerobes, which can survive in the absence of oxygen. Cultured "Bacillus" species test positive for the enzyme catalase if oxygen has been used or is present. "Bacillus" species can form oval endospores and can remain in this dormant state for years. The endospore of one species from Morocco is reported to have survived being heated to 420 °C. Endospore formation is usually triggered by a lack of nutrients: the bacterium divides within its cell wall, and one side then engulfs the other. They are not true spores (i.e., not offspring). Endospore formation originally defined the genus, but not all such species are closely related, and many species have been moved to other genera of the "Bacillota". Only one endospore is formed per cell. The spores are resistant to heat, cold, radiation, desiccation, and disinfectants. "Bacillus anthracis" needs oxygen to sporulate; this constraint has important consequences for epidemiology and control. In vivo, "B. anthracis" produces a polypeptide (polyglutamic acid) capsule that protects it from phagocytosis. The genera "Bacillus" and "Clostridium" constitute the family "Bacillaceae". Species are identified by using morphologic and biochemical criteria. Because the spores of many "Bacillus" species are resistant to heat, radiation, disinfectants, and desiccation, they are difficult to eliminate from medical and pharmaceutical materials and are a frequent cause of contamination. They are also resistant to chemicals such as antibiotics, and this resistance allows them to survive for many years, even in controlled environments. "Bacillus" species are well known in the food industries as troublesome spoilage organisms. Ubiquitous in nature, "Bacillus" includes symbiotic (sometimes referred to as endophytic) as well as independent species. Two species are medically significant: "B. anthracis" causes anthrax, and "B. cereus" causes food poisoning. Many species of "Bacillus" can produce copious amounts of enzymes, which are used in various industries, such as in the production of alpha-amylase, used in starch hydrolysis, and the protease subtilisin, used in detergents. "B. subtilis" is a valuable model for bacterial research. Some "Bacillus" species can synthesize and secrete lipopeptides, in particular surfactins and mycosubtilins. "Bacillus" species are also found in marine sponges. Marine sponge-associated "Bacillus subtilis" (strains WS1A and YBS29) can synthesize several antimicrobial peptides, and these strains can induce disease resistance in "Labeo rohita". Structure. Cell wall. The cell wall of "Bacillus" is a structure on the outside of the cell that forms the second barrier between the bacterium and the environment, while maintaining the rod shape and withstanding the pressure generated by the cell's turgor. The cell wall is made of teichoic and teichuronic acids. 
subtilis" is the first bacterium for which the role of an actin-like cytoskeleton in cell shape determination and peptidoglycan synthesis was identified and for which the entire set of peptidoglycan-synthesizing enzymes was localized. The role of the cytoskeleton in shape generation and maintenance is important. "Bacillus" species are rod-shaped, endospore-forming aerobic or facultatively anaerobic, Gram-positive bacteria; in some species cultures may turn Gram-negative with age. The many species of the genus exhibit a wide range of physiologic abilities that allow them to live in every natural environment. Only one endospore is formed per cell. The spores are resistant to heat, cold, radiation, desiccation, and disinfectants. Origin of name. The genus "Bacillus" was named in 1835 by Christian Gottfried Ehrenberg, to contain rod-shaped (bacillus) bacteria. He had seven years earlier named the genus "Bacterium". "Bacillus" was later amended by Ferdinand Cohn to further describe them as spore-forming, Gram-positive, aerobic or facultatively anaerobic bacteria. Like other genera associated with the early history of microbiology, such as "Pseudomonas" and "Vibrio", the 266 species of "Bacillus" are ubiquitous. The genus has a very large ribosomal 16S diversity. Isolation and identification. Established methods for isolating "Bacillus" species for culture primarily involve suspension of sampled soil in distilled water, heat shock to kill off vegetative cells leaving primarily viable spores in the sample, and culturing on agar plates with further tests to confirm the identity of the cultured colonies. Additionally, colonies which exhibit characteristics typical of "Bacillus" bacteria can be selected from a culture of an environmental sample which has been significantly diluted following heat shock or hot air drying to select potential "Bacillus" bacteria for testing. Cultured colonies are usually large, spreading, and irregularly shaped. Under the microscope, the "Bacillus" cells appear as rods, and a substantial portion of the cells usually contain oval endospores at one end, making them bulge. Characteristics of "Bacillus" spp.. S.I. Paul et al. (2021) isolated and identified multiple strains of "Bacillus subtilis" (strains WS1A, YBS29, KSP163A, OA122, ISP161A, OI6, WS11, KSP151E, and S8,) from marine sponges of the Saint Martin's Island Area of the Bay of Bengal, Bangladesh. Based on their study, colony, morphological, physiological, and biochemical characteristics of "Bacillus" spp. are shown in the Table below. Note: + = Positive, – =Negative, O= Oxidative, F= Fermentative Phylogeny. It's been long known that the (pre-2020) definition of "Bacillus" is overly vague. One clade, formed by "Bacillus anthracis", "Bacillus cereus", "Bacillus mycoides", "Bacillus pseudomycoides", "Bacillus thuringiensis", and "Bacillus weihenstephanensis" under the 2011 classification standards, should be a single species (within 97% 16S identity), but for medical reasons, they are considered separate species (an issue also present for four species of "Shigella" and "Escherichia coli"). Species. Species orphaned and assigned to other genera: Ecological and clinical significance. "Bacillus" species are ubiquitous in nature, e.g. in soil. They can occur in extreme environments such as high pH ("B. alcalophilus"), high temperature ("B. thermophilus"), and high salt concentrations ("B. halodurans"). 
They are also very commonly found as endophytes in plants, where they can play a critical role in plant immune responses, nutrient absorption and nitrogen fixation. "B. thuringiensis" produces a toxin that can kill insects and thus has been used as an insecticide. "B. siamensis" produces antimicrobial compounds that inhibit plant pathogens, such as the fungi "Rhizoctonia solani" and "Botrytis cinerea", and it promotes plant growth by volatile emissions. Some species of "Bacillus" are naturally competent for DNA uptake by transformation. Industrial significance. Many "Bacillus" species are able to secrete large quantities of enzymes. "Bacillus amyloliquefaciens" is the source of the natural antibiotic protein barnase (a ribonuclease), of alpha-amylase used in starch hydrolysis, of the protease subtilisin used in detergents, and of the BamH1 restriction enzyme used in DNA research. A portion of the "Bacillus thuringiensis" genome has been incorporated into corn and cotton crops; the resulting plants are resistant to some insect pests. "Bacillus subtilis" (natto) is the key microbial participant in the ongoing production of natto, the traditional soya-based fermented food, and some "Bacillus" species are on the Food and Drug Administration's GRAS (generally regarded as safe) list. The capacity of selected "Bacillus" strains to produce and secrete large quantities (20–25 g/L) of extracellular enzymes has placed them among the most important industrial enzyme producers. The ability of different species to ferment in the acid, neutral, and alkaline pH ranges, combined with the presence of thermophiles in the genus, has led to the development of a variety of new commercial enzyme products with the desired temperature, pH activity, and stability properties to address a variety of specific applications. Classical mutation and/or selection techniques, together with advanced cloning and protein engineering strategies, have been exploited to develop these products. Efforts to produce and secrete high yields of foreign recombinant proteins in "Bacillus" hosts initially appeared to be hampered by the degradation of the products by host proteases. Recent studies have revealed that the slow folding of heterologous proteins at the membrane-cell wall interface of Gram-positive bacteria renders them vulnerable to attack by wall-associated proteases. In addition, the presence of thiol-disulphide oxidoreductases in "B. subtilis" may be beneficial in the secretion of disulphide-bond-containing proteins. Such advances in our understanding of the complex protein translocation machinery of Gram-positive bacteria should allow the resolution of current secretion challenges and make "Bacillus" species preeminent hosts for heterologous protein production. "Bacillus" strains have also been developed and engineered as industrial producers of nucleotides, the vitamin riboflavin, the flavor agent ribose, and the supplement poly-gamma-glutamic acid. With the recent characterization of the genome of "B. subtilis" 168 and of some related strains, "Bacillus" species are poised to become the preferred hosts for the production of many new and improved products as we move through the genomic and proteomic era. Use as model organism. "Bacillus subtilis" is one of the best understood prokaryotes in terms of molecular and cellular biology. Its superb genetic amenability and relatively large size have provided the powerful tools required to investigate a bacterium from all possible aspects. 
Recent improvements in fluorescence microscopy techniques have provided novel insight into the dynamic structure of a single-cell organism. Research on "B. subtilis" has been at the forefront of bacterial molecular biology and cytology, and the organism is a model for differentiation, gene/protein regulation, and cell-cycle events in bacteria.
4752
27015025
https://en.wikipedia.org/wiki?curid=4752
Brasília
Brasília is the capital of Brazil and the Federal District. Located in the Brazilian highlands in the country's Central-West region, it was founded by President Juscelino Kubitschek on 21 April 1960, to replace Rio de Janeiro as the national capital. Brasília is Brazil's third-most populous city after São Paulo and Rio de Janeiro, with a population of 2.8 million. Among major Latin American cities, it has the highest GDP per capita. Brasília is a planned city, developed by Lúcio Costa, Oscar Niemeyer and Joaquim Cardozo in 1956 as part of a scheme to move the capital from Rio de Janeiro to a more central location, which was chosen by a committee. The landscape architect was Roberto Burle Marx. The city's design divides it into numbered blocks as well as sectors for specified activities, such as the Hotel Sector, the Banking Sector, and the Embassy Sector. Brasília was inscribed as a UNESCO World Heritage Site in 1987 for its modernist architecture and uniquely artistic urban planning. It was named "City of Design" by UNESCO in October 2017 and has been part of the Creative Cities Network since then. It is notable for its white-colored, modern architecture, designed by Oscar Niemeyer. All three branches of Brazil's federal government are located in the city: executive, legislative and judiciary. Brasília also hosts 124 foreign embassies. The city's international airport connects it to all other major Brazilian cities and some international destinations, and it is the third-busiest airport in Brazil. Brasília was one of the main host cities of the 2014 FIFA World Cup and hosted some of the football matches of the 2016 Summer Olympics; it also hosted the 2013 FIFA Confederations Cup. The city is laid out in the shape of an airplane: its "fuselage" is the Monumental Axis, a pair of wide avenues flanking a large park, and in the "cockpit" is Praça dos Três Poderes, named for the three branches of government surrounding it. Brasília has a unique legal status, as it is an administrative region rather than a municipality like other cities in Brazil. The name "Brasília" is often used as a synonym for the Federal District as a whole, which is divided into 35 administrative regions, one of which (Plano Piloto) includes the area of the originally planned city and its federal government buildings. The entire Federal District is considered by IBGE to make up Brasília's city area, and the local government considers the entirety of the district plus 12 neighboring municipalities in the state of Goiás to be its metropolitan area. Etymology. The term Brasília comes from the Latin translation of Brazil, which was suggested as a name for the country's capital in 1821 by José Bonifácio de Andrada e Silva. History. Background. Brazil's first capital was Salvador; in 1763 Rio de Janeiro became the capital and remained so until 1960. During this period, resources tended to be centered in Brazil's southeastern region, and most of the country's population was concentrated near its Atlantic coast. Brasília's geographically central location fostered a more regionally neutral federal capital. An article of the country's first republican constitution, dated 1891, states that the capital should be moved from Rio de Janeiro to a place close to the country's center. The idea of relocating Brazil's capital city was conceived in 1827 by José Bonifácio, an advisor to Emperor Pedro I. 
He presented a plan to the General Assembly of Brazil for a new city called Brasília, with the idea of moving the capital westward from the heavily populated southeastern corridor. The bill was not enacted because Pedro I dissolved the Assembly. According to legend, the Italian saint Don Bosco had a dream in 1883 in which he described a futuristic city that roughly fitted Brasília's location. In Brasília today, many references to Bosco, who founded the Salesian order, are found throughout the city, and one church parish in the city bears his name. Costa plan. Juscelino Kubitschek was elected President of Brazil in 1955. Upon taking office in January 1956, in fulfilment of his campaign pledge, he initiated the planning and construction of the new capital. The following year an international jury selected Lúcio Costa's plan to guide the construction of Brazil's new capital, Brasília. Costa was a student of the famous modernist architect Le Corbusier, and some features of modernist architecture can be found in his plan. Costa's plan was not as detailed as some of the plans presented by other architects and city planners; it did not include land-use schedules, models, population charts or mechanical drawings. However, it was chosen by five out of six jurors because it had the features required to guide the growth of a capital city. Even though the initial plan was transformed over time, it oriented much of the construction, and most of its features survived. Brasília's accession as the new capital and its designation for the development of an extensive interior region inspired the symbolism of the plan. Costa used a cross-axial design indicating the possession and conquest of this new place with a cross, often likened to a dragonfly, an airplane or a bird. Costa's plan included two principal components, the Monumental Axis (east to west) and the Residential Axis (north to south). The Monumental Axis was assigned political and administrative activities and is considered the body of the city; with the style and simplicity of its buildings, oversized scales, and broad vistas and heights, it produces a sense of monumentality. This axis includes the various ministries, the national congress, the presidential palace, the supreme court building and the television and radio tower. The Residential Axis was intended to contain areas with an intimate character and is considered the most important achievement of the plan; it was designed for housing and associated functions such as local commerce, schooling, recreation and churches, and consisted of 96 superblocks limited to six-story buildings and 12 additional superblocks limited to three-story buildings. Costa's intention with the superblocks was to create small, self-contained and self-sufficient neighborhoods with uniform buildings and apartments of two or three different categories, in which he envisioned upper and middle classes integrated by sharing the same residential area. The urban design of the communal apartment blocks was based on Le Corbusier's Ville Radieuse of 1935, and the superblocks on the North American Radburn layout from 1929. Visually, the blocks were intended to appear absorbed by the landscape because they were isolated by a belt of tall trees and lower vegetation. Costa attempted to introduce a more equitable Brazil; he also designed housing for the working classes that was separated from the upper- and middle-class housing and was visually different, with the intention of avoiding slums ("favelas") in the urban periphery. 
The superblocks have nevertheless been accused of being spaces where individuals are oppressed and alienated, amounting to a form of spatial segregation. One of the main objectives of the plan was to allow the free flow of automobile traffic: the plan included lanes of traffic in a north–south direction (seven in each direction) for the Monumental Axis and three arterials (the W3, the Eixo and the L2) for the Residential Axis; the cul-de-sac access roads of the superblocks were planned to be the end of the main flow of traffic. The reason behind the heavy emphasis on automobile traffic was the architect's desire to establish the concept of modernity at every level. Though automobiles were invented before the 20th century, mass production in the early 20th century made them widely available, and they became a symbol of modernity. The two small axes around the Monumental Axis provide loops and exits for cars to enter smaller roads. Some argue that the plan's emphasis on automobiles lengthened the distances between centers and served only the needs of the small segment of the population that owned cars. The bus transportation system, however, cannot be ignored: the bus routes inside the city operate heavily on the W3 and L2, almost anywhere, including the satellite cities, can be reached just by taking the bus, and most of the Plano Piloto can be reached without transferring between buses. Later, as the population of the city increased, the transportation system also played an important role in mediating the relationship between the Pilot Plan and the satellite cities. Owing to the larger influx of vehicles, traffic lights were introduced to the Monumental Axis, which undercut the concept of modernity and advancement the architect had first employed. Additionally, the metro system in Brasília was mainly built for inhabitants of the satellite cities. Though this growth means Brasília is no longer a pure utopia of incomparable modernity, the later development of traffic management, bus routes to the satellite cities, and the metro system all serve as remedies, enabling citizens to enjoy a kind of modernity that was not carefully planned. At the intersection of the Monumental and Residential Axes, Costa planned the city center with the transportation center (Rodoviária), the banking sector and the hotel sector; near the city center, he proposed an amusement center with theaters, cinemas and restaurants. Costa's plan is seen as having a sectoral tendency, segregating the banks, the office buildings, and the amusement center. One of the main features of Costa's plan was that he presented a new city with its future shape and patterns evident from the beginning, which meant paving streets that were not immediately put into use. The advantage of this was that the original plan is hard to undo, because an entire street network was provided for; the disadvantage is that the plan is difficult to adapt and mold to other circumstances in the future. In addition, there has been controversy over the monumental aspect of Lúcio Costa's plan, because to some it appeared to be 19th-century city planning rather than modern 20th-century urbanism. An interesting analysis can be made of Brasília within the context of Cold War politics and the association of Lúcio Costa's plan with the symbolism of aviation. From an architectural perspective, the airplane-shaped plan was certainly an homage to Le Corbusier and his enchantment with the aircraft as an architectural masterpiece.
However, Brasília was constructed soon after the end of World War II. Despite Brazil's minor participation in the conflict, the airplane shape of the city was key in envisioning the country as part of the newly globalized world, together with the victorious Allies. Furthermore, Brasília is a unique example of modernism, both as a guideline for architectural design and as a principle for organizing society. Modernism in Brasília is explored in James Holston's book, "The Modernist City". Construction. Juscelino Kubitschek, president of Brazil from 1956 to 1961, ordered Brasília's construction, fulfilling the promise of the Constitution and his own political campaign promise. Building Brasília was part of Juscelino's "fifty years of prosperity in five" plan. As early as 1892, the astronomer Louis Cruls, in the service of the Brazilian government, had investigated the site for the future capital. Lúcio Costa won the 1957 design contest, in which 5,550 entrants competed, and became the main urban planner. Oscar Niemeyer was the chief architect of most public buildings, Joaquim Cardozo was the structural engineer, and Roberto Burle Marx was the landscape designer. Brasília was built in 41 months, from 1956 to 21 April 1960, when it was officially inaugurated. Geography. The city sits at high elevation on the Brazilian Highlands in the country's center-western region. Paranoá Lake, a large artificial lake, was built to increase the amount of water available and to maintain the region's humidity. It has a marina and hosts wakeboarders and windsurfers. Diving can also be practiced there, and one of the main attractions is Vila Amaury, an old village submerged in the lake, where the first construction workers of Brasília used to live. Climate. Brasília has a tropical savanna climate ("Aw" in the Köppen climate classification), milder due to the elevation, with two distinct seasons: the rainy season, from October to April, and the dry season, from May to September. September, at the end of the dry season, has the highest average maximum temperature, while July has the lowest average maximum and minimum temperatures. Average temperatures are fairly consistent from September through March. November is the month with the highest rainfall of the year, while July is the driest. During the dry season, the city can have very low relative humidity levels, often below 30%. According to the Brazilian National Institute of Meteorology (INMET), the record low temperature was recorded on 18 July 1975, and the record high on 18 October 2015 and 8 October 2020. The highest accumulated rainfall in 24 hours was recorded on 15 November 1963. Demographics. Ethnic groups. According to the 2022 IBGE Census, 2,817,381 people resided in Brasília and its metropolitan area, of whom 1,370,836 were Mixed (48.7%), 1,126,334 White (40%), 301,765 Black (10.7%), 12,810 Asian (0.5%), and 5,536 Amerindian (0.1%). In 2010, Brasília was ranked the fourth-most populous city in Brazil after São Paulo, Rio de Janeiro, and Salvador. In 2010, the city had 474,871 opposite-sex couples and 1,241 same-sex couples, and the population was 52.2% female and 47.8% male. In the 1960 census there were almost 140,000 residents in the new Federal District. By 1970 this figure had grown to 537,000, and by 2010 the population of the Federal District had surpassed 2.5 million.
The city of Brasília proper, the Plano Piloto, was planned for about 500,000 inhabitants, a figure it has never surpassed: its current population is only 214,529. Its metropolitan area within the Federal District, however, has grown well past this figure. From the beginning, the growth of Brasília exceeded the original estimates. According to the original plans, Brasília would be a city for government authorities and staff; however, during its construction, Brazilians from all over the country migrated to the satellite cities of Brasília, seeking public and private employment. At the close of the 20th century, Brasília was the largest city in the world that had not existed at the beginning of the century. Brasília has one of the highest population growth rates in Brazil, with annual growth of 2.82%, mostly due to internal migration. Brasília's inhabitants include a foreign population of mostly embassy workers as well as large numbers of Brazilian internal migrants. Today, the city has important communities of immigrants and refugees. The city's Human Development Index was 0.936 in 2000 (developed level), and the city's literacy rate was around 95.65%. Religion. Christianity is by far the most prevalent religion in Brasília, with Roman Catholicism being the largest denomination. Government. Brasília does not have a mayor or councillors, because article 32 of the Constitution of Brazil expressly prohibits the Federal District from being divided into municipalities. The Federal District is a legal entity of internal public law, part of the political-administrative structure of Brazil of a "sui generis" nature: it is neither a state nor a municipality, but a special entity that incorporates the legislative powers reserved to both states and municipalities, as provided in Article 32, § 1º of the Constitution, which gives it a hybrid, state-and-municipal nature. The executive power of the Federal District was represented by the mayor of the Federal District until 1969, when the position was transformed into governor of the Federal District. The legislative power is represented by the Legislative Chamber of the Federal District, whose nomenclature mixes that of a legislative assembly (the legislature of the other units of the federation) and a municipal chamber (the legislature of the municipalities). The Legislative Chamber is made up of 24 district deputies. The judicial power serving the Federal District is constituted to serve federal territories as well, but since Brazil currently has no territories, the Court of Justice of the Federal District and of the Territories serves only the Federal District. Part of the budget of the Federal District Government comes from the Constitutional Fund of the Federal District. In 2012, the fund totaled 9.6 billion reais; the forecast for 2015 was 12.4 billion reais, of which more than half (6.4 billion) was allocated to public security. International relations. Brasília is twinned with several cities; of these, Abuja and Washington, D.C. were also cities specifically planned as the seat of government of their respective countries. Brasília is also associated with several significant declarations in the international political and social field. Economy.
The major roles of construction and of services (government, communications, banking and finance, food production, entertainment, and legal services) in Brasília's economy reflect the city's status as a governmental rather than an industrial center. Industries connected with construction, food processing, and furnishings are important, as are those associated with publishing, printing, and computer software. The gross domestic product (GDP) is divided among public administration (54.8%), services (28.7%), industry (10.2%), commerce (6.1%), and agribusiness (0.2%). Besides being the political center, Brasília is an important economic center. In 2018, it had the third-highest GDP among Brazilian cities, R$254 billion, representing 3.6% of the total Brazilian GDP. Most economic activity in the federal capital results from its administrative function. Its industrial planning is studied carefully by the Government of the Federal District. Being a city listed by UNESCO, Brasília's government has opted to encourage the development of non-polluting industries such as software, film, video, and gemology, with emphasis on environmental preservation and maintaining ecological balance, preserving the city's protected status. According to Mercer's cost-of-living rankings for expatriate employees, Brasília ranked 45th among the most expensive cities in the world in 2012, up from 70th in 2010, behind São Paulo (12th) and Rio de Janeiro (13th). Industries. Industries in the city include construction (Paulo Octavio, Via Construções, and Irmãos Gravia, among others); food processing (Perdigão, Sadia); furniture making; recycling (Novo Rio, Rexam, Latasa, and others); pharmaceuticals (União Química); and graphic industries. The main agricultural products produced in the city are coffee, guavas, strawberries, oranges, lemons, papayas, soybeans, and mangoes. It has over 110,000 cows and exports wood products worldwide. The Federal District, where Brasília is located, has a GDP of R$133.4 billion (about US$64.1 billion), about the same as Belarus according to The Economist. Its share of the total Brazilian GDP is about 3.8%. The Federal District has the largest GDP per capita in Brazil, US$25,062, slightly higher than Belarus. The city's planned design included specific areas for almost everything, including accommodation in the Hotel Sectors North and South. New hotel facilities are being developed elsewhere, such as the Hotels and Tourism Sector North, located on the shores of Lake Paranoá. Culture. As a venue for political events, music performances and film festivals, Brasília is a cosmopolitan city, with around 124 embassies, a wide range of restaurants and a complete infrastructure ready to host events of any kind. Not surprisingly, the city stands out as an important business and tourism destination, an important part of the local economy, with dozens of hotels spread around the federal capital. Traditional parties take place throughout the year. In June, large festivals known as "festas juninas" are held celebrating Catholic saints such as Saint Anthony of Padua, Saint John the Baptist, and Saint Peter. On 7 September, the traditional Independence Day parade is held on the Ministries Esplanade. Throughout the year, local, national, and international events are held across the city. Christmas is widely celebrated, and New Year's Eve usually brings major celebrations to the city.
The city also hosts a varied assortment of artworks by artists such as Bruno Giorgi, Alfredo Ceschiatti, Athos Bulcão, Marianne Peretti, Alfredo Volpi, Di Cavalcanti, Dyllan Taxman, Victor Brecheret and Burle Marx, whose works have been integrated into the city's architecture, making it a unique landscape. The cuisine in the city is very diverse, and many of the best restaurants in the city can be found in the Asa Sul district. The city is the birthplace of Brazilian rock and the place of origin of bands such as Legião Urbana, Capital Inicial, Aborto Elétrico, Plebe Rude and Raimundos. Brasília hosts the Rock Basement Festival, which brings new bands to the national scene; the festival is held in the parking lot of the Brasília National Stadium Mané Garrincha. Held since 1965, the annual Brasília Festival of Brazilian Cinema is one of the most traditional film festivals in Brazil, compared only to the Brazilian Cinema Festival of Gramado, in Rio Grande do Sul. The difference between the two is that the festival in Brasília preserves the tradition of screening and awarding only Brazilian films. The International Dance Seminar in Brasília has brought top-level dance to the federal capital since 1991. International teachers, shows with choreographers and guest groups, and scholarships abroad are some of the hallmarks of the event. The Seminar is the central axis of the DANCE BRAZIL program and is promoted by the DF State Department of Culture in partnership with the Cultural Association Claudio Santoro. Brasília has also been the focus of modern-day literature. Published in 2008, "The World In Grey: Dom Bosco's Prophecy", by author Ryan J. Lucero, tells an apocalyptic story based on the famous prophecy from the late 19th century by the Italian saint Don Bosco. According to Don Bosco's prophecy: "Between parallels 15 and 20, around a lake which shall be formed; A great civilization will thrive, and that will be the Promised Land". Brasília lies between the parallels 15° S and 20° S, where an artificial lake (Paranoá Lake) was formed. Don Bosco is Brasília's patron saint. "American Flagg!", the First Comics comic book series created by Howard Chaykin, portrays Brasília as a cosmopolitan world capital of culture and exotic romance; in the series, it is a top vacation and party destination. The 2015 Rede Globo series "Felizes para Sempre?" was set in Brasília. Architecture and urbanism. At the Square of Three Powers, the Brazilian architect Oscar Niemeyer and the Brazilian structural engineer Joaquim Cardozo created buildings in the style of modern Brazilian architecture. The Congress also occupies various other surrounding buildings, some connected by tunnels. The National Congress building is located in the middle of the Eixo Monumental, the city's main avenue. In front lies a large lawn and reflecting pool. The building faces the Praça dos Três Poderes, where the Palácio do Planalto and the Supreme Federal Court are located. The Brazilian landscape architect Roberto Burle Marx designed landmark modernist gardens for some of the principal buildings. In residential areas, buildings were built that were inspired by French modernist and Bauhaus design. Although not fully accomplished, the "Brasília utopia" has produced a city of relatively high quality of life, in which citizens live in forested areas with sporting and leisure facilities (the "superquadras") surrounded by small commercial areas, bookstores and cafés; the city is famous for its cuisine and the efficiency of its transit.
Even these positive features have sparked controversy, expressed in the nickname "ilha da fantasia" ("fantasy island"), indicating the sharp contrast between the city and the surrounding regions of the states of Goiás and Minas Gerais, marked by poverty and disorganization. Critics of Brasília's grand scale have characterized it as a modernist Bauhaus platonic fantasy about the future. Notable structures. The Cathedral of Brasília, in the capital of the Federative Republic of Brazil, is an expression of the atheist architect Oscar Niemeyer and the structural engineer Joaquim Cardozo. This concrete-framed hyperboloid structure, with its glass roof, seems to reach up, open, to the heavens. The cathedral's structure was finished on 31 May 1970, when only the diameter of the circular area was visible. Niemeyer's and Cardozo's design for the Cathedral of Brasília is based on a hyperboloid of revolution with asymmetric sections (the standard equation of such a surface is sketched after this section). The hyperboloid structure itself is the result of 16 identical concrete columns assembled together. There is controversy as to what these columns, with their hyperbolic sections and weight of 90 t each, represent: some say they are two hands moving upwards towards heaven, others associate them with the chalice Jesus used at the Last Supper, and some claim they represent his crown of thorns. The cathedral was dedicated on 31 May 1970. At the end of the "Eixo Monumental" ("Monumental Axis") lies the "Esplanada dos Ministérios" ("Ministries Esplanade"), an open area in downtown Brasília. The rectangular lawn is surrounded by two eight-lane avenues where many government buildings, monuments and memorials are located. On Sundays and holidays, the Eixo Monumental is closed to cars so that locals may use it as a place to walk, bike, and have picnics under the trees. "Praça dos Três Poderes" (Portuguese for "Square of the Three Powers") is a plaza in Brasília. The name derives from the meeting of the three federal branches around the plaza: the executive, represented by the Palácio do Planalto (presidential office); the legislative, represented by the National Congress (Congresso Nacional); and the judiciary, represented by the Supreme Federal Court (Supremo Tribunal Federal). It is a tourist attraction in Brasília, designed by Lúcio Costa and Oscar Niemeyer as a place where the three branches would meet harmoniously. The Palácio da Alvorada is the official residence of the president of Brazil. The palace was designed, along with the rest of the city of Brasília, by Oscar Niemeyer and inaugurated in 1958. One of the first structures built in the republic's new capital city, the "Alvorada" lies on a peninsula at the shore of Lake Paranoá. The principles of simplicity and modernity that in the past characterized the great works of architecture motivated Niemeyer. The viewer has the impression of looking at a glass box softly landing on the ground, supported by thin external columns. The building has an area of 7,000 m2 on three floors: basement, landing, and second floor. The auditorium, kitchen, laundry, medical center, and administration offices are at basement level. The rooms used by the presidency for official receptions are on the landing. The second floor has four suites, two apartments, and various private rooms, which make up the residential part of the palace. The building also has a library, a heated Olympic-sized swimming pool, a music room, two dining rooms and various meeting rooms. A chapel and heliport are in adjacent buildings.
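As a point of reference for the cathedral's geometry mentioned above, a hyperboloid of revolution is the surface obtained by rotating a hyperbola about its axis. A minimal sketch of the standard one-sheet form follows; here a and c are free shape parameters for illustration only, not the cathedral's actual dimensions:

\[
\frac{x^{2} + y^{2}}{a^{2}} - \frac{z^{2}}{c^{2}} = 1
\]

A hyperboloid of one sheet is doubly ruled, meaning the curved surface can also be swept out entirely by straight lines, a property often exploited in concrete shell construction.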
The Palácio do Planalto is the official workplace of the president of Brazil. It is located at the Praça dos Três Poderes in Brasília. As the seat of government, "Planalto" is often used as a metonym for the executive branch. The main working office of the President of the Republic is in the Palácio do Planalto; the President and his or her family do not live there, but in the official residence, the Palácio da Alvorada. Besides the President, senior advisors also have offices in the "Planalto", including the Vice-President of Brazil and the Chief of Staff; the other ministries are along the Esplanada dos Ministérios. The architect of the Palácio do Planalto was Oscar Niemeyer, creator of most of the important buildings in Brasília. The idea was to project an image of simplicity and modernity, using fine lines and waves to compose the columns and exterior structures. The palace is four stories high and has an area of 36,000 m2. Four other adjacent buildings are also part of the complex. Education. The education component of Brasília's Human Development Index reached 0.804 in 2020, a considerably high level in line with the standards of the United Nations Development Programme (UNDP), while the literacy rate of the population over the age of ten indicated by the last demographic census was 96.7%, above the national average (91%). The city has seven international schools: the American School of Brasília, the Brasília International School, the Escola das Nações, the Swiss International School, the Lycée Français François Mitterrand, the Maple Bear Canadian School, and the British School of Brasília. Brasília has two universities, three university centers, and many private colleges. The main tertiary educational institutions are the Universidade de Brasília – University of Brasília (UnB) (public); the Universidade Católica de Brasília – Catholic University of Brasília; the Centro Universitário de Brasília (UniCEUB); the Centro Universitário Euroamericano (Unieuro); and the Instituto de Educação Superior de Brasília. There is an extreme concentration of higher education institutions in the Plano Piloto. In 2006, a new campus of the University of Brasília was set up in Planaltina, and there are also UnB campuses in the administrative regions of Ceilândia and Gama. The number of libraries is not proportional to the size of the population, as the main public libraries in the Federal District are concentrated in the city center, such as the University of Brasília Library, the House and Senate Library, the Brasília Demonstration Library and the Leonel de Moura Brizola National Library, also known as the Brasília National Library, which opened in 2006. Transportation. The average commute time on public transit in Brasília, for example to and from work, on a weekday is 96 minutes; 31% of public transit riders ride for more than 2 hours every day. The average wait at a stop or station is 28 minutes, and 61% of riders wait for over 20 minutes on average every day. Single trips on public transit tend to be long, with half of riders travelling a considerable distance in a single direction. Airport. Brasília–Presidente Juscelino Kubitschek International Airport serves the metropolitan area with major domestic and international flights. It is the third-busiest Brazilian airport by passengers and aircraft movements. Because of its strategic location, it is a civil aviation hub for the rest of the country.
This results in a large number of takeoffs and landings, and it is not unusual for flights to be delayed in a holding pattern before landing. Following the airport's master plan, Infraero built a second runway, which was finished in 2006. In 2007, the airport handled 11,119,872 passengers. The main building's third floor, with 12,000 square meters, has a panoramic deck, a food court, shops, four movie theaters with a total capacity of 500 people, and space for exhibitions. Brasília Airport has 136 vendor spaces. The airport lies outside the metro system, a short distance from the central area of Brasília. The area outside the airport's main gate is lined with taxis as well as several bus lines connecting the airport to Brasília's central district, and the parking lot accommodates 1,200 cars. The airport is served by domestic and regional airlines (TAM, GOL, Azul, WebJET, Trip and Avianca), in addition to a number of international carriers. In 2012, the concession to operate Brasília's international airport was won by the InfraAmerica consortium, formed by the Brazilian engineering company ENGEVIX and the Argentine holding company Corporacion America, each with a 50% stake. During the 25-year concession, the airport may be expanded to handle up to 40 million passengers a year. In 2014 the airport received 15 new boarding bridges, bringing the total to 28. This was the main requirement made by the federal government, which transferred the operation of the terminal to the Inframerica Group after an auction; the group invested R$750 million in the project. In the same year, the number of parking spaces doubled, reaching three thousand, the airport's entrance received a new rooftop cover and a new access road, and a VIP room was created on Terminal 1's third floor. The investments increased the airport's capacity from approximately 15 million passengers per year to 21 million by 2014. Brasília Air Force Base (ALA1), one of the most important bases of the Brazilian Air Force, is located in Brasília. Road transport. Like most Brazilian cities, Brasília has a good network of taxi companies. Taxis are available outside the airport terminal, though at times there can be quite a queue, and although the airport is not far from the downtown area, taxi prices tend to be higher than in other Brazilian cities; booking in advance can be advantageous, particularly if time is limited. The Juscelino Kubitschek Bridge, also known as the President JK Bridge or simply the JK Bridge, crosses Lake Paranoá in Brasília. It is named after Juscelino Kubitschek, former president of Brazil, and was designed by the architect Alexandre Chan and the structural engineer Mário Vila Verde. Chan won the Gustav Lindenthal Medal for this project at the 2003 International Bridge Conference in Pittsburgh for "...outstanding achievement demonstrating harmony with the environment, aesthetic merit and successful community participation". It consists of three tall asymmetrical steel arches that crisscross diagonally. With a length of 1,200 m (0.75 miles), it was completed in 2002 at a cost of US$56.8 million. The bridge has a pedestrian walkway and is accessible to cyclists and skaters. The main bus hub in Brasília is the Central Bus Station, located at the crossing of the Eixo Monumental and the Eixão, a short distance from the Three Powers Plaza. The original plan was to have a bus station as near as possible to every corner of Brasília.
Today, the bus station is the hub of urban buses only, some running within Brasília and others connecting Brasília to the satellite cities. In the original city plan, the interstate buses would also stop at the Central Station; because of the growth of Brasília (and the corresponding growth of the bus fleet), the interstate buses now leave from the older interstate station (called the Rodoferroviária), located at the western end of the Eixo Monumental. The Central Bus Station also contains a main metro station. A new bus station was opened in July 2010 on the Saída Sul (South Exit) near the Parkshopping Mall and its metro station; it also serves as an interstate station, used only for departures from the Federal District. Metro. The Federal District Metro is the Federal District's underground metro system. The system has 24 stations on two lines, the Orange and Green lines, covering part of the Federal District. Both lines begin at the Central Station and run parallel until the Águas Claras Station. The network is not comprehensive, so buses may provide better access to the center. The metro leaves the Rodoviária (bus station) and goes south, avoiding most of the political and tourist areas. Its main purpose is to serve satellite cities such as Samambaia, Taguatinga and Ceilândia, as well as Guará and Águas Claras. The satellite cities served are more populous in total than the Plano Piloto itself (the census of 2000 indicated that Ceilândia had 344,039 inhabitants, Taguatinga had 243,575, and the Plano Piloto had approximately 400,000 inhabitants), and most residents of the satellite cities depend on public transportation. The Expresso Pequi high-speed railway was planned to link Brasília and Goiânia, the capital of the state of Goiás, but it will probably be turned into a regional service linking the capital cities and the cities in between, including Anápolis and Alexânia. A 22 km light rail line is also planned, estimated to cost between 1 billion reais (US$258 million) and 1.5 billion reais, with capacity to transport around 200,000 passengers per day. Sport. The main stadiums are the Estádio Nacional Mané Garrincha (re-inaugurated on 18 May 2013), the Serejão Stadium (home of Brasiliense) and the Bezerrão Stadium (home of Gama). Several professional clubs play in Brasília, such as Brasiliense, Gama, Brasília FC, and Real Brasília, but none are currently in the top two levels of Brazilian football; clubs from Rio de Janeiro and São Paulo remain the most supported throughout the city. Brasília was one of the host cities of the 2014 FIFA World Cup and the 2013 FIFA Confederations Cup, hosting the opening of the Confederations Cup and seven World Cup games. Brasília also hosted football matches during the 2016 Summer Olympics held in Rio de Janeiro. Brasília is known as a departure point for unpowered air sports, practiced with hang gliding or paragliding wings. Practitioners of such sports note that, because of the city's dry weather, it offers strong thermal winds and great "cloud streets", which is also the name of a maneuver much appreciated by practitioners. In 2003, Brasília hosted the 14th Hang Gliding World Championship, one of the categories of free flying, and in August 2005 the city hosted the second stage of the Brazilian Hang Gliding Championship.
Brasília is the site of the Autódromo Internacional Nelson Piquet, which hosted the Grande Prêmio Presidente Emílio Médici, a non-championship round of the 1974 Formula One season won by Emerson Fittipaldi. An IndyCar race was cancelled at the last minute in 2015 due to financial concerns. The track, closed since 2015, is being renovated for the end of 2023 after a deal was struck with Banco de Brasília and Terracap. The city is also home to one of Brazil's top basketball clubs, the three-time NBB champion Uniceub BRB, which hosts some of its games at the 16,000-seat Nilson Nelson Gymnasium. Brasília attempted to host the 2000 Summer Olympics but withdrew its application. In tennis, Brasília hosts the Aberto da República and formerly hosted the Aberto de Brasília. See also. Purpose-built Brazilian state capitals
4754
437762
https://en.wikipedia.org/wiki?curid=4754
Blue Streak (missile)
The de Havilland Propellers Blue Streak was a British intermediate-range ballistic missile (IRBM), and later the first stage of the Europa satellite launch vehicle. Blue Streak was cancelled without entering full production. The project was intended to maintain an independent British nuclear deterrent, replacing the V bomber fleet, which would become obsolete by 1965. The operational requirement for the missile was issued in 1955 and the design was complete by 1957. During development, it became clear that the missile system was too expensive and too vulnerable to a surprise attack. The missile project was cancelled in 1960, with the US-led Skybolt the preferred replacement. Partly to avoid political embarrassment from the cancellation, the UK government proposed that the rocket be used as the first stage of a civilian satellite launcher called Black Prince. As the cost was thought to be too great for the UK alone, international collaboration was sought. This led to the formation of the European Launcher Development Organisation (ELDO), with Blue Streak used as the first stage of a carrier rocket named Europa. Europa was tested at the Woomera Test Range, Australia, and later at Kourou in French Guiana. Following launch failures, the ELDO project was cancelled in 1972 and Blue Streak with it. Background. Post-war Britain's nuclear armament was initially based on free-fall bombs delivered by the V bomber force. It soon became clear that if Britain wanted a credible nuclear deterrent threat, a ballistic missile was essential. There was a political need for an independent deterrent, so that Britain could remain a major world power. Britain was unable to purchase American weapons wholesale due to the restrictions of the Atomic Energy Act of 1946. In April 1954 the Americans proposed a joint development programme for ballistic missiles: the United States would develop an intercontinental ballistic missile (ICBM), the SM-65 Atlas, while the United Kingdom, with United States support, would develop an intermediate-range ballistic missile (IRBM) of shorter range. The proposal was accepted as part of the Wilson-Sandys Agreement of August 1954, which provided for collaboration, exchange of information, and mutual planning of development programmes. The decision to develop was influenced by what could be learnt about missile design and development in the US. Initial requirements for the booster were made by the Royal Aircraft Establishment at Farnborough, with input on the rocket engine design from the Rocket Propulsion Establishment at Westcott. Operational Requirement 1139 demanded a minimum range that the initially proposed rocket would only just have reached. Development. The de Havilland Propellers company won the contract to build the missile, which was to be powered by an uprated liquid-fuelled Rocketdyne S-3D engine, developed by Rolls-Royce and called the RZ.2. Two variants of this engine were developed: the first for the missile, and the second, of greater static thrust, for the three-stage satellite launch vehicle. The engines could be vectored by seven degrees in flight and were used to guide the missile. This configuration, however, put considerable pressure on the autopilot, which had to cope with a vehicle whose weight was diminishing rapidly and which was steered by large engines whose thrust remained more or less constant.
Vibration was also a problem, particularly at engine cut-off, and the later development of the autopilot for the satellite launcher was in itself a considerable achievement. Subcontractors included the Sperry Gyroscope Company, which produced the missile guidance system, while the nuclear warhead was designed by the Atomic Weapons Research Establishment at Aldermaston. The missiles used liquid oxygen and kerosene propellants. While the vehicle could be left fully laden with over 20 tonnes of kerosene, the 60 tonnes of liquid oxygen had to be loaded immediately before launch, or icing became a problem. Because of this, fuelling the rocket took 4.5 minutes, which would have made it useless as a rapid response to an attack: the missile was vulnerable to a pre-emptive nuclear strike, launched without warning or in the absence of any heightening of tension sufficient to warrant readying the missile. To mitigate this problem, de Havilland created a stand-by feature: a missile could be held at 30 seconds' notice to launch for ten hours. As the missiles were to be deployed in pairs, and it took ten hours for one missile to be prepared for stand-by, one of the two missiles could always be ready for rapid launch (a toy check of this rotation appears after this section). To protect the missiles against a pre-emptive strike while being fuelled, the idea of siting the missiles in underground launchers was developed. These would have been designed to withstand a nearby one-megaton blast, and were a British innovation subsequently exported to the United States. Finding sites for these silos proved extremely difficult. RAF Spadeadam in Cumberland (now Cumbria) was the only site where construction was started on a full-scale underground launcher, although test borings were undertaken at a number of other locations. The remains of this test silo, known as U1, were rediscovered during tree felling at Spadeadam in 2004. This was also the site where the RZ.2 rocket engines, and the complete Blue Streak missile, were tested. The best sites for silo construction were the more stable rock strata in parts of southern and north-east England and eastern Scotland, but the construction of many underground silos in the countryside carried enormous economic, social, and political costs. Development of the underground launchers presented a major technical challenge: 1/60- and 1/6-scale models based on a U-shaped design were constructed and tested at RPE Westcott, and three alternative designs were drawn up, with one chosen as the prototype, designated K11. RAF Upavon appears to have been the preferred location for the prototype operational launcher, with the former RNAS station at Crail the likely first operational site. In 1955–1956, the rocket motors were test-fired at the High Down Rocket Test Site on the Isle of Wight. As no site in Britain provided enough space for test flights, a test range was established at Woomera, South Australia. Cancellation as a military project. Doubts arose as the cost escalated from the first tentative figure of £50 million, submitted to the Treasury in early 1955, to £300 million in late 1959. Its detractors in the civil service claimed that the programme was progressing too slowly compared with the speed of development in the US and the Soviet Union. Estimates within the civil service for completion of the project ranged from a total spend of £550 million to £1.3 billion, as different ministers pressed for either abandoning or continuing the project. The project was unexpectedly cancelled in April 1960.
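The arithmetic of the paired stand-by arrangement referenced above (ten hours of preparation buying ten hours at 30 seconds' readiness) can be sanity-checked with a small sketch. This is a toy model under assumptions of our own, not a description of actual RAF procedure; in particular, the assumption that each missile repeats its prepare/stand-by cycle indefinitely is ours:

```python
# Toy check of the two-missile stand-by rotation.
# Assumptions (illustrative only): each missile needs 10 hours of
# preparation to reach stand-by, can then hold 30-second readiness
# for 10 hours, and the cycle repeats indefinitely.
PREP_HOURS = 10
STANDBY_HOURS = 10
CYCLE = PREP_HOURS + STANDBY_HOURS

def on_standby(offset_hours: float, t: float) -> bool:
    """True if a missile whose cycle began at offset_hours is on stand-by at hour t."""
    phase = (t - offset_hours) % CYCLE
    return phase >= PREP_HOURS  # first 10 h preparing, next 10 h ready

# Two missiles phased half a cycle apart: at least one is always ready.
for t in range(48):
    assert on_standby(0, t) or on_standby(10, t), f"coverage gap at hour {t}"
print("one missile of the pair is ready at every sampled hour")
```

The key point the sketch illustrates is that continuous coverage requires the stand-by duration to be at least as long as the preparation time; with both equal to ten hours, two missiles phased ten hours apart leave no gap.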
Whitehall opposition grew, and the missile was cancelled on the ostensible grounds that it would be too vulnerable to a first-strike attack. Admiral of the Fleet Lord Mountbatten had spent considerable effort arguing that the project should be cancelled at once in favour of arming the Navy with nuclear weapons capable of a pre-emptive strike. Some considered the cancellation of Blue Streak a blow not only to British military-industrial efforts, but also to the Commonwealth ally Australia, which had its own vested interest in the project. The British military transferred its hopes for a strategic nuclear delivery system to the Anglo-American Skybolt missile, before that project too was cancelled by the United States as its ICBM programme reached maturity. The British instead purchased the Polaris system from the Americans, carried in British-built submarines. Civilian programmes - Black Prince and ELDO. After the cancellation as a military project, there was reluctance to abandon the programme entirely because of the huge cost already incurred. Blue Streak would have become the first stage of a projected all-British satellite launcher known as "Black Prince": the second stage was derived from the "Black Knight" test vehicle, and the orbital injection stage was a small hydrogen peroxide/kerosene motor. Black Prince proved too expensive for the UK, and the European Launcher Development Organisation (ELDO) was set up instead. This used Blue Streak as the first stage, with French and German second and third stages. The Blue Streak first stage was successfully tested three times at the Woomera test range in Australia as part of the ELDO programme. Black Prince. In 1959, a year before the cancellation of Blue Streak as a missile, the government requested that the RAE and Saunders-Roe design a carrier rocket based on Blue Streak and Black Knight. This design used Blue Streak as a first stage and a second stage based on the Black Knight. Several different third stages would have been available, depending on the required payload and orbit. The cost of developing Black Prince was estimated at £35 million. It was planned that Black Prince would be a Commonwealth project, but as the government of John Diefenbaker in Canada was already spending more money than publicly acknowledged on Alouette, and Australia was not interested in the project, these two countries were unwilling to contribute. South Africa was no longer a member of the Commonwealth, and New Zealand was only likely to make "modest" contributions. European Launcher Development Organisation. The UK instead proposed a collaboration with other European countries to build a three-stage launcher capable of placing a one-ton payload into low Earth orbit. The European Launcher Development Organisation consisted of Belgium, Britain, France, West Germany, Italy and the Netherlands, with Australia as an associate member. Preliminary work began in 1962 and ELDO was formally signed into existence in 1964. The UK's Blue Streak became the first stage of the European launch vehicle, with France providing the Coralie second stage and Germany the third. Italy worked on the satellite project, the Netherlands and Belgium concentrated on tracking and telemetry systems, and Australia supplied the launch site. The combined launcher was named Europa. After ten test launches, the Woomera launch site was judged unsuitable for putting satellites into geosynchronous orbit, and in 1966 it was decided to move to the French site of Kourou in South America.
F11 was fired from there in November 1971, but the failure of the autopilot caused the vehicle to break up. The launch of F12 was postponed while a project review was carried out, which led to the decision to abandon the Europa design. ELDO was merged with the European Space Research Organisation to form the European Space Agency. Related projects. Aside from Black Prince, a range of other proposals was made between 1959 and 1972 for a carrier rocket based on Blue Streak, but none of these was ever built in full, and they survive today only as designs. De Havilland/British Interplanetary Society proposal. In 1959 de Havilland suggested solving the problem of the Blue Streak/Black Knight geometry by compressing the 10 by 1 metre (30 by 3 foot) Black Knight into a sphere. Although this seemed logical, the development costs proved too high for the limited budget of the programme. Westland Helicopters Black Arrow. Following its merger with Saunders-Roe, Westland Helicopters developed the three-stage Black Arrow satellite carrier rocket, derived from the Black Knight test vehicle. The first stage of Black Arrow was given the same diameter as the French Coralie (the second stage of Europa) to make it compatible with Blue Streak; using Blue Streak as an additional stage would have increased Black Arrow's payload capacity. To maintain this compatibility, the first-stage diameter was specified in metres, although the rest of the rocket was defined in imperial units. Black Arrow carried out four test launches (without an additional Blue Streak stage) from Woomera between 1969 and 1971, the final launch carrying the satellite Prospero X-3 into orbit. The United Kingdom remains the only country to have developed and then abandoned a satellite launch capability. Hawker Siddeley Dynamics proposal. In 1972, Hawker Siddeley Dynamics Ltd produced a brochure for a design using Blue Streak as the first stage of a two-stage-to-orbit rocket, with an American Centaur upper stage. The Centaur second stage would have either been built in the UK under licence or imported directly from the USA. Both the Centaur and Blue Streak had proved very reliable up to this point, and since both were already designed, development costs would have been low. Furthermore, the rocket would have had a payload of 870–920 kg to geosynchronous orbit with, and 650–700 kg without, the use of additional booster rockets. Blue Streak today. Following the cancellation of the Blue Streak project, some of the remaining rockets were preserved. A section of the propulsion bay, engines and equipment can be found at the Solway Aviation Museum, Carlisle Lake District Airport. Only a few miles from the Spadeadam testing site, the museum holds many exhibits, photographs and models of the Blue Streak programme, having inherited the original Spadeadam collection that used to be displayed on site. RZ.2 engines are on display at the National Space Centre – a pair on cradles alongside the Blue Streak rocket – and at the Armagh Planetarium in Northern Ireland and the Euro Space Center in Redu, Belgium. The Blue Streak enthusiast Robin Joseph, from the United Kingdom, has a collection of parts including start systems and combustion chambers; he can often be seen displaying his collection at space days in the West Midlands. A part of the Blue Streak F1 rocket launched on 5 June 1964 from Woomera, Australia, found 50 km SE of Giles in 1980 (c. 1,000 km), is on display at the Giles Weather Station.
The titanium structure of a German third stage was, for some time, sited on the edge of a gravel pit in Gloucestershire. Remains of Blue Streak F4, launched on 24 May 1965, are on display at Woomera. Footage of a Blue Streak launch was briefly incorporated into the final episode of "The Prisoner", "Fall Out". It was also used in the "Doctor Who" serial "The Tenth Planet", treated within the story as the launch of the "Zeus IV" spacecraft. Images of the Blue Streak 1 are incorporated in the 1997 film "Contact".
4756
82835
https://en.wikipedia.org/wiki?curid=4756
Bakassi
Bakassi is a peninsula on the Gulf of Guinea. It lies between the Cross River estuary, near the city of Calabar, and the Rio del Rey estuary to the east. It is governed by Cameroon, following the transfer of sovereignty from neighbouring Nigeria as a result of a judgment by the International Court of Justice. On 22 November 2007, the Nigerian Senate rejected the transfer, holding that the Greentree Agreement ceding the area to Cameroon was contrary to Section 12(1) of the 1999 Constitution. Regardless, the territory was completely ceded to Cameroon on 14 August 2008, exactly two years after the first part of it was transferred. Geography and economy. The peninsula lies between latitudes 4°25′ and 5°10′N and longitudes 8°20′ and 9°08′E. It consists of a number of low-lying, largely mangrove-covered islands. The population of Bakassi is the subject of some dispute, but is generally put at between 150,000 and 300,000 people. Bakassi is situated at the extreme eastern end of the Gulf of Guinea, where the warm east-flowing Guinea Current (called Aya Efiat in Efik) meets the cold north-flowing Benguela Current (called Aya Ubenekang in Efik). These two ocean currents interact, creating huge foamy breakers which constantly advance towards the shore and build up shoals rich in fish, shrimps, and a wide variety of other marine life, making the area a very fertile fishing ground from which most of the population make their living. The peninsula is commonly described as "oil-rich", though in fact no commercially viable deposits of oil have been discovered there. The area has nevertheless aroused considerable interest from oil companies in the light of the discovery of rich reserves of high-grade crude oil in Nigeria, and at least eight multinational oil companies have participated in the exploration of the peninsula and its offshore waters. In October 2012, the China Petroleum & Chemical Corporation announced it had discovered new oil and gas resources in the Bakassi region. History. During the Scramble for Africa, Queen Victoria signed a Treaty of Protection with the King and Chiefs of Akwa Akpa, known to Europeans as Old Calabar, on 10 September 1884. This enabled the British Empire to exercise control over the entire territory around Calabar, including Bakassi. The territory subsequently became "de facto" part of Nigeria, although the border was never permanently delineated. However, documents released by Cameroon, in parity with those of the British and Germans, clearly place Bakassi in Cameroonian territory as a consequence of colonial-era Anglo-German agreements. After Southern Cameroons voted in 1961 to leave Nigeria and become part of Cameroon, Bakassi remained under Calabar administration in Nigeria until the ICJ judgment of 2002. Population. Bakassi's inhabitants are mainly the Oron people, the people of Cross River State and Akwa Ibom State of Nigeria. Territorial dispute. Nigeria and Cameroon disputed the possession of Bakassi for some years, leading to considerable tension between the two countries. In 1981 the two countries went to the brink of war over Bakassi and another area around Lake Chad, at the other end of the two countries' common border. More armed clashes broke out in the early 1990s. In response, Cameroon took the matter to the International Court of Justice (ICJ) on 29 March 1994. The case was extremely complex, requiring the court to review diplomatic exchanges dating back over 100 years.
Nigeria relied largely on Anglo-German correspondence dating from 1885, as well as treaties between the colonial powers and the indigenous rulers in the area, particularly the 1884 Treaty of Protection. Cameroon pointed to the Anglo-German treaty of 1913, which defined spheres of control in the region, as well as to two agreements signed in the 1970s between Cameroon and Nigeria: the Yaoundé II Declaration of 4 April 1971 and the Maroua Declaration of 1 June 1975, which were devised to outline maritime boundaries between the two countries following their independence. The line was drawn through the Cross River estuary to the west of the peninsula, thereby implying Cameroonian ownership of Bakassi. Nigeria, however, never ratified the agreement, while Cameroon regarded it as being in force. ICJ verdict. The ICJ delivered its judgment on 10 October 2002, finding (based principally on the Anglo-German agreements) that sovereignty over Bakassi did indeed rest with Cameroon. It instructed Nigeria to transfer possession of the peninsula, but did not require the inhabitants to move or to change their nationality. Cameroon was thus given a substantial Nigerian population and was required to protect its rights, infrastructure and welfare. The verdict caused consternation in Nigeria, arousing vitriolic comments from Nigerian officials and the Nigerian media alike. Chief Richard Akinjide, a former Nigerian Attorney-General and Minister of Justice who had been a leading member of Nigeria's legal team, described the decision as "50% international law and 50% international politics", "blatantly biased and unfair", "a total disaster", and a "complete fraud". The Nigerian newspaper "The Guardian" went further, declaring that the judgment was "a rape and unforeseen potential international conspiracy against Nigerian territorial integrity and sovereignty" and "part of a Western ploy to foment and perpetuate trouble in Africa". The outcome of the controversy was a "de facto" Nigerian refusal to withdraw its troops from Bakassi and transfer sovereignty. The Nigerian government did not, however, openly reject the judgment, but instead called for an agreement that would provide "peace with honour, with the interest and welfare of our people". The ICJ judgment was backed by the United Nations, whose charter potentially allowed sanctions or even the use of force to enforce the court's ruling. Secretary-General Kofi Annan stepped in as a mediator and chaired a tripartite summit with the two countries' presidents on 15 November 2002, which established a commission to facilitate the peaceful implementation of the ICJ's judgment. A further summit was held on 31 January 2004. This made significant progress, but the process was complicated by the opposition of Bakassi's inhabitants to being transferred to Cameroon. Bakassian leaders threatened to seek independence if Nigeria renounced sovereignty, and this secession was announced on 9 July 2006 as the "Democratic Republic of Bakassi". The decision was reportedly made at a meeting on 2 July 2006, and "The Vanguard" newspaper of Nigeria reported the decision to secede. The declaration was reportedly made by groups of militants including Southern Cameroons under the aegis of the Southern Cameroons Peoples Organisation (SCAPO), the Bakassi Movement for Self-Determination (BAMOSD), and the Movement for the Emancipation of the Niger Delta (MEND).
The Biafran separatist group Biafra Nations League (BNL), initially known as the Biafra Nations Youth League and led by Princewill Chimezie Richard (known as Prince Obuka) and Ebuta Akor Takon (not the former deputy, Ebuta Ogar Takon), moved its operational base to the peninsula after a series of warnings to the Nigerian government over the plight of the internally displaced natives and the reported killing of those remaining on the peninsula by Cameroonian forces. This came amid clashes between Nigerian troops and the Bakassi Strike Force, a militant group focused on attacking Nigerian and Cameroonian forces. According to the Nigerian newspaper "The Nation", BNL leaders were apprehended by Nigerian troops in the Ikang-Cameroon border area on 9 November 2016; reports linked the Biafran group to the militants. BNL demanded that oil companies authorized by Nigeria and Cameroon to drill for oil leave the maritime boundary area of the Bakassi Peninsula, and the group also threatened to attack Cameroonian forces. Resolution. On 13 June 2006, President Olusegun Obasanjo of Nigeria and President Paul Biya of Cameroon resolved the dispute in talks led by UN Secretary-General Kofi Annan in New York City. Obasanjo agreed to withdraw Nigerian troops within 60 days and to leave the territory completely in Cameroonian control within the next two years. Annan said, "With today's agreement on the Bakassi peninsula, a comprehensive resolution of the dispute is within our grasp. The momentum achieved must be sustained." Nigerian withdrawal and low-level insurgency. Nigeria began to withdraw its forces, comprising some 3,000 troops, on 1 August 2006, and a ceremony on 14 August marked the formal handover of the northern part of the peninsula. The remainder stayed under Nigerian civil authority for two more years. On 22 November 2007, the Nigerian Senate passed a resolution declaring the withdrawal from the Bakassi Peninsula illegal. The government took no action and handed the final parts of Bakassi over to Cameroon on 14 August 2008 as planned, even though a Federal High Court had stated this should be delayed until all accommodation for resettled Bakassians had been settled; the government did not heed this court order and set in motion the mechanisms needed to override it. Fishermen displaced from Bakassi were first settled in a landlocked area called New Bakassi, which they claimed was already inhabited and suitable only for farmers, not for fishermen like themselves. The displaced people were then moved to Akpabuyo, and eventually established a new community at Dayspring. Despite the formal handover of Bakassi by Nigeria to Cameroon in 2006, the territory of Bakassi is still listed among the 774 local governments of Nigeria in the First Schedule, Part I of the 1999 Constitution of the Federal Republic of Nigeria. After the 2015 Nigerian general elections, Nigeria's 8th National Assembly still accommodated the Calabar-South/Akpabuyo/Bakassi Federal Constituency, represented by Hon. Essien Ekpeyong Ayi of the People's Democratic Party. In the 2010s and 2020s, Biafran separatists, most prominently the Biafra Nations League, have continued a low-level militant resistance against Cameroon over Bakassi.
4757
29877045
https://en.wikipedia.org/wiki?curid=4757
Bestiary
A bestiary is a compendium of beasts. Originating in the ancient world, bestiaries were made popular in the Middle Ages in illustrated volumes that described various animals and even rocks. The natural history and illustration of each beast was usually accompanied by a moral lesson. This reflected the belief that the world itself was the Word of God and that every living thing had its own special meaning. For example, the pelican, which was believed to tear open its breast to bring its young to life with its own blood, was a living representation of Jesus. Thus the bestiary is also a reference to the symbolic language of animals in Western Christian art and literature. History. The bestiary — the medieval book of beasts — was among the most popular illuminated texts in northern Europe during the Middle Ages (about 500–1500). Medieval Christians understood every element of the world as a manifestation of God, and bestiaries largely focused on each animal's religious meaning. Much of what is in the bestiary came from the ancient Greeks and their philosophers. The earliest bestiary in the form in which it was later popularized was an anonymous 2nd-century Greek volume called the "Physiologus", which itself summarized ancient knowledge and wisdom about animals from the writings of classical authors, such as Aristotle's "Historia Animalium" and various works by Herodotus, Pliny the Elder, Solinus, Aelian, and other naturalists. Following the "Physiologus", Saint Isidore of Seville (Book XII of the "Etymologiae") and Saint Ambrose expanded the religious message with reference to passages from the Bible and the Septuagint. They and other authors freely expanded or modified pre-existing models, constantly refining the moral content without interest in or access to much more detail regarding the factual content. Nevertheless, the often fanciful accounts of these beasts were widely read and generally believed to be true. A few observations found in bestiaries, such as the migration of birds, were discounted by the natural philosophers of later centuries, only to be rediscovered in the modern scientific era. Medieval bestiaries are remarkably similar in the sequence of the animals they treat. Bestiaries were particularly popular in England and France around the 12th century and were mainly compilations of earlier texts. The Aberdeen Bestiary is one of the best known of over 50 manuscript bestiaries surviving today. The bestiary tradition remained influential through the later Middle Ages and into the Renaissance, which has been said to have started around the 14th century in Italy, and even into modern times. Bestiaries influenced early heraldry in the Middle Ages, giving ideas for charges and also for the artistic form. Bestiaries continue to inspire coats of arms created in our time. Two illuminated Psalters, the Queen Mary Psalter (British Library Ms. Royal 2B, vii) and the Isabella Psalter (State Library, Munich), contain full Bestiary cycles. The bestiary in the Queen Mary Psalter is found in the "marginal" decorations that occupy about the bottom quarter of the page, and are unusually extensive and coherent in this work. In fact, the bestiary has been expanded beyond its source, the Norman bestiary of Guillaume le Clerc, to ninety animals. Some are placed in the text to make correspondences with the psalm they are illustrating. Many later compilers made their own bestiaries, combining their own observations with knowledge from previous ones.
These observations could be recorded in text form as well as in illustrations. The Italian artist Leonardo da Vinci also made his own bestiary. A "volucrary" is a similar collection of the symbols of birds that is sometimes found in conjunction with bestiaries. The most widely known volucrary in the Renaissance was Johannes de Cuba's "Gart der Gesundheit", which describes 122 birds and which was printed in 1485. Bestiary content. The contents of medieval bestiaries were often compiled by combining older textual sources and accounts of animals, such as the "Physiologus". Medieval bestiaries contained detailed descriptions and illustrations of species native to Western Europe, exotic animals, and what in modern times are considered to be imaginary animals. Descriptions of the animals included the physical characteristics associated with the creature, although these were often physiologically incorrect, along with the Christian morals that the animal represented. The description was then often accompanied by an artistic illustration of the animal as described in the bestiary. For example, in one bestiary the eagle is depicted in an illustration and is said to be the “king of birds.” Bestiaries were organized in different ways based upon the sources they drew upon. The descriptions could be organized by animal groupings, such as terrestrial and marine creatures, or presented in an alphabetical manner. However, the texts drew no distinction between existing and imaginary animals. Descriptions of creatures such as dragons, unicorns, basilisks, griffins, and the caladrius were common in such works and found intermingled amongst accounts of bears, boars, deer, lions, and elephants. In one source, the author explains how fables and bestiaries are closely linked to one another, as “each chapter of a bestiary, each fable in a collection, has a text and has a meaning.” This lack of separation has often been associated with the assumption that people during this time believed in what the modern period classifies as nonexistent or "imaginary creatures". However, this assumption is currently under debate, with various explanations being offered. Some scholars, such as Pamela Gravestock, have written on the theory that medieval people did not actually think such creatures existed but instead focused on the belief in the importance of the Christian morals these creatures represented, and that the importance of the moral did not change regardless of whether the animal existed. The historian of science David C. Lindberg pointed out that medieval bestiaries were rich in symbolism and allegory, so as to teach moral lessons and entertain, rather than to convey knowledge of the natural world. Religious significance. The association between animals and religion long predates the bestiaries. Many ancient civilizations attached religious or mythological meanings to animals: Egyptian gods were depicted with the faces of animals, and Greek deities had symbolic animals, such as the eagle of Zeus. Bestiaries and their lessons thus drew on older civilizations' observations and interpretations of animal symbolism. As most of the readers of these bestiaries were monks and clerics, their religious significance was considerable.
The bestiary was used to educate young men on the correct morals they should display. Each of the animals presented in the bestiaries carries some sort of lesson or meaning. Much of the animal symbolism also echoes pagan traditions, given the religious climate and time period of the Middle Ages. One of the main 'animals' mentioned in some of the bestiaries is the dragon, which holds much significance in terms of religion and meaning; the unnatural elements of the dragon's history show how important the church was during this time, and the dragon's treatment in the bestiaries offers a glimpse of the religious significance running through many of these tales. In almost every animal there is some way to connect it to a lesson from the church or a familiar religious story. With animals holding significance since ancient times, it is fair to say that bestiaries gave fuel to the meanings attached to animals, whether real or mythical. Modern bestiaries. In modern times, artists such as Henri de Toulouse-Lautrec and Saul Steinberg have produced their own bestiaries. Jorge Luis Borges wrote a contemporary bestiary of sorts, the "Book of Imaginary Beings", which collects imaginary beasts from bestiaries and fiction. Nicholas Christopher wrote a literary novel called "The Bestiary" (Dial, 2007) that describes a lonely young man's efforts to track down the world's most complete bestiary. John Henry Fleming's "Fearsome Creatures of Florida" (Pocol Press, 2009) borrows from the medieval bestiary tradition to impart moral lessons about the environment. Caspar Henderson's "The Book of Barely Imagined Beings" (Granta 2012, University of Chicago Press 2013), subtitled "A 21st Century Bestiary", explores how humans imagine animals in a time of rapid environmental change. In July 2014, Jonathan Scott wrote "The Blessed Book of Beasts" (Eastern Christian Publications), featuring 101 animals from the various translations of the Bible, in keeping with the tradition of the bestiary found in the writings of the Saints, including Saint John Chrysostom. The modern discipline of cryptozoology, the study of unknown species, has been linked to medieval bestiaries, since in many cases the unknown animals are the same and carry meaning or significance of their own. The lists of monsters to be found in video games (such as "NetHack", "Dragon Quest", and "Monster Hunter"), as well as some tabletop role-playing games such as Warhammer Fantasy Roleplay, Dungeons & Dragons, and Pathfinder, are often termed bestiaries.
4760
7903804
https://en.wikipedia.org/wiki?curid=4760
The Ballad of the Green Berets
"The Ballad of the Green Berets" is a 1966 patriotic song co-written and performed by Barry Sadler, in the style of a ballad about the United States Army Special Forces. It was one of the few popular songs of the Vietnam War years to cast the military in a positive light. The song became a major hit in January 1966, reaching number one for five weeks on the "Billboard" Hot 100, and was ranked number one of that chart's most successful songs of 1966. It was also a crossover hit, reaching number one on "Billboard" Easy Listening chart and number two on "Billboard" Country survey. "The Ballad of the Green Berets" was the most commercially successful topical song of the Vietnam War era. Background. Sadler began writing the song while he was training to be a Special Forces medic. After earning his Parachutist Badge, Sadler knew one line of the song would mention "silver wings upon their chests." Author Robin Moore, who wrote the book "The Green Berets", helped him write the lyrics and later sign a recording contract with RCA Records. The demonstation tape of the song was produced in a rudimentary recording studio at Fort Bragg, North Carolina, with the help of Gerry Gitell and Lieutenant General William P. Yarborough. The lyrics were written, in part, in honor of U.S. Army Specialist 5 James Gabriel Jr., a Special Forces operator and the first native Hawaiian to die in Vietnam. Gabriel was killed by Viet Cong gunfire while on a training mission with the South Vietnamese Army on April 8, 1962. One verse mentioned Gabriel by name, but it was not used in the recorded version. Sadler recorded the song and 11 others with Sid Bass at RCA's 24th Street Studios in New York City on December 18, 1965. The song and album, "Ballads of the Green Berets", were released in January 1966. Release and reception. In the United States, "The Ballad of the Green Berets" shipped two million copies in its first five weeks of release, making it the then-fastest selling single in RCA's history. It topped the "Billboard" Hot 100 in March 1966, staying at number one for five weeks. It placed tenth on the year-end Hot 100 chart published by "Billboard" in December 1966. When "Billboard" later revised its year-end rankings for 1966, the song was re-ranked at number one; since then, "Billboard" has recognized "The Ballad of the Green Berets" as the top Hot 100 song of that year. On "Cash Box" 1966 year-end chart, "The Ballad of the Green Berets" tied for first with "California Dreamin'" by the Mamas and the Papas. It was also the number-21 song of the 1960s as ranked by Joel Whitburn. The single sold more than nine million copies.</ref> Sadler performed the song on television on January 30, 1966, on "The Ed Sullivan Show", and on other TV shows, including "The Hollywood Palace" and "The Jimmy Dean Show". "The Ballad of the Green Berets" was uniquely successful in an era of protest songs and anti-Vietnam War sentiment, focusing not on battle but the humanity of the soldiers. Its appearance on the Billboard Country chart, despite not having any overtly country music traits, is a testament to its broad appeal. In terms of sales and chart activity, musicologist R. Serge Denisoff called "The Ballad of the Green Berets" the most successful topical song of the Vietnam War era. In film. "The Ballad of the Green Berets" was used in the 1968 John Wayne film "The Green Berets", based on Robin Moore's book. Though Wayne personally requested the song as his theme, the film's composer, Miklós Rózsa, feared it would sound old-fashioned. 
Nevertheless, Rózsa made two distinct arrangements of the song for the film's opening and closing credits. Though he considered the results "corny," he realized the charismatic Wayne could "get away" with having the song as his theme. "I don't think anyone mentioned whether the music was good or bad," said Rózsa of the film's predominantly negative reviews.
4763
48643156
https://en.wikipedia.org/wiki?curid=4763
Baroque dance
Baroque dance is dance of the Baroque era (roughly 1600–1750), closely linked with Baroque music, theatre, and opera. English country dance. The majority of surviving choreographies from the period are English country dances, such as those in the many editions of Playford's "The Dancing Master". The descriptions in these various publications give the music, the formation, the number of dancers, and textual descriptions of the figures to be danced in relation to the musical bars, i.e. the floor patterns of the dances. There is only occasional indication of the steps used, presumably because they were well known. However, other sources of the period, such as the writings of the French dancing-masters Feuillet and Lorin, indicate that steps more complicated than simple walking were used, at least some of the time. English country dance survived well beyond the Baroque era and eventually spread in various forms across Europe and its colonies, and to all levels of society. The French Noble style. The great innovations in dance in the 17th century originated at the French court under Louis XIV, and it is here that we see the first clear stylistic ancestor of classical ballet. The same basic technique was used both at social events and as theatrical dance in court ballets and at public theaters. The style of dance is commonly known to modern scholars as the "French noble style" or "belle danse" (French, literally "beautiful dance"); however, it is often referred to casually as "baroque dance", in spite of the existence of other theatrical and social dance styles during the baroque era. Primary sources include more than three hundred choreographies in Beauchamp–Feuillet notation, as well as manuals by Raoul Auger Feuillet and Pierre Rameau in France, Kellom Tomlinson, P. Siris, and John Weaver in England, and Gottfried Taubert in Germany (i.e. Leipzig, Saxony). This wealth of evidence has allowed modern scholars and dancers to recreate the style, although areas of controversy still exist. The standard modern introduction is Hilton. French dance types include: The English, working in the French style, added their own hornpipe to this list. Many of these dance types are familiar from baroque music, perhaps most spectacularly in the stylized suites of J. S. Bach. Note, however, that the allemandes that occur in these suites do not correspond to a French dance from the same period. Theatrical dance. The French noble style was danced both at social events and by professional dancers in theatrical productions such as opera-ballets and court entertainments. However, 18th-century theatrical dance had at least two other styles: comic or grotesque, and semi-serious. Other social dance styles. Other dance styles, such as the Italian and Spanish dances of the period, are much less well studied than either English country dance or the French style. The general picture seems to be that during most of the 17th century, a style of late Renaissance dance was widespread, but as time progressed, French ballroom dances such as the minuet were widely adopted at fashionable courts. Beyond this, the evolution and cross-fertilisation of dance styles is an area of ongoing research. Modern reconstructions. The revival of baroque music in the 1960s and '70s sparked renewed interest in 17th- and 18th-century dance styles.
While more than 300 of these dances had been preserved in Beauchamp–Feuillet notation, it was not until the mid-20th century that serious scholarship commenced in deciphering the notation and reconstructing the dances. Perhaps best known among these pioneers was Britain's Melusine Wood, who published several books on historical dancing in the 1950s. Wood passed her research on to her student Belinda Quirey, and also to Pavlova Company ballerina and choreographer Mary Skeaping (1902–1984). The latter became well known for her reconstructions of baroque ballets for London's "Ballet for All" company in the 1960s. The leading figures of the second generation of historical dance research include Shirley Wynne and her Baroque Dance Ensemble, which was founded at Ohio State University in the early 1970s, and Wendy Hilton (1931–2002), a student of Belinda Quirey who supplemented the work of Melusine Wood with her own research into original sources. A native of Britain, Hilton arrived in the U.S. in 1969, joining the faculty of the Juilliard School in 1972 and establishing her own baroque dance workshop at Stanford University in 1974, which endured for more than 25 years. Catherine Turocy (b. circa 1950) began her studies in baroque dance in 1971 as a student of dance historian Shirley Wynne. She founded The New York Baroque Dance Company in 1976 with Ann Jacoby, and the company has since toured internationally. In 1982/83, as part of the French national celebration of Rameau's 300th birthday, Turocy choreographed the first production of Jean-Philippe Rameau's "Les Boréades", which had never been performed during the composer's lifetime. This French-supported production, conducted by John Eliot Gardiner with his orchestra, was directed by Jean-Louis Martinoty. Turocy has been decorated as a Chevalier in the Ordre des Arts et des Lettres by the French government. In 1973, French dance historian Francine Lancelot (1929–2003) began her formal studies in ethnomusicology, which later led her to research French traditional dance forms and eventually Renaissance and Baroque dances. In 1980, at the invitation of the French Minister of Culture, she founded the baroque dance company "Ris et Danceries". Her work in choreographing the landmark 1986 production of Lully's 1676 tragédie lyrique "Atys" was part of the national celebration of the 300th anniversary of Lully's death. This production propelled the career of William Christie and his ensemble Les Arts Florissants. Since the Ris et Danceries company was disbanded circa 1993, choreographers from the company have continued with their own work. Béatrice Massin, with her "Compagnie Fêtes Galantes", and Marie-Geneviève Massé, with her company "L'Eventail", are among the most prominent. In 1995, Francine Lancelot's catalogue raisonné of baroque dance, entitled "La Belle Dance", was published.
4764
47812778
https://en.wikipedia.org/wiki?curid=4764
Borzoi
The Borzoi or Russian Hunting Sighthound is a Russian breed of hunting dog of sighthound type. It was formerly used for wolf hunting; until 1936, the breed was known as the Russian Wolfhound. Etymology. The system by which Russians over the ages named their sighthounds was a series of descriptive terms rather than actual names. "Borzói" is the masculine singular form of an archaic Russian adjective that means 'fast'. "Borzáya sobáka" ('fast dog') is the basic term for sighthounds used by Russians, though "sobáka" is usually dropped. The name "psovaya" derived from the word "psovina", which means 'wavy, silky coat', just as "hortaya" (as in hortaya borzaya) means shorthaired. In modern Russian, the breed commonly called the Borzoi is officially known as "russkaya psovaya borzaya". Other Russian sighthound breeds are "stepnaya borzaya" (from the steppe), called "stepnoy"; and "krymskaya borzaya" (from the Crimea), called "krymskoy". History. The Borzoi originated in sixteenth-century Russia from crosses of the Saluki and European sighthounds with thick-coated Russian breeds. The Borzoi was popular with the Tsars before the 1917 revolution. For centuries, Borzois could not be purchased but only given as gifts from the Tsar. Grand Duke Nicholas Nicolaievich of Russia bred countless Borzoi at Perchino, his private estate. The breed was almost rendered extinct after the revolution, as the communists associated the breed with the upper classes and killed Borzoi in large numbers. The Russkaya Psovaya Borzaya was definitively accepted by the Fédération Cynologique Internationale in 1956. Description. Appearance. Borzois are large Russian sighthounds that resemble some central Asian breeds such as the Afghan hound, Saluki, and the Kyrgyz Taigan. Borzois come in a variety of colours. The Borzoi coat is silky and flat, often wavy or slightly curly. The long top-coat is quite flat, with varying degrees of waviness or curling. The soft undercoat thickens during winter or in cold climates, but is shed in hot weather to prevent overheating. In its texture and distribution over the body, the Borzoi coat is unique. There should be a frill on its neck, as well as feathering on its hindquarters and tail. Temperament. The Borzoi is an affectionate and athletic breed of dog with a calm temperament. In terms of obedience, Borzois are selective learners who quickly become bored with repetitive activity, and they can be difficult to motivate. Nevertheless, Borzois are definitely capable of enjoying and performing well in competitive obedience and agility trials with the right kind of training. Health. A 2024 UK study found an average life expectancy of 12 years for Borzois (from a sample of 43), compared to 12.7 for purebreds and 12 for mongrels. An American study of echocardiograms of clinically healthy Borzois found 53.8% to have heart murmurs, 30.2% to have trace or mild mitral regurgitation, 36.1% to have mild tricuspid regurgitation, and 14.4% to have cardiac disease.
4765
49927710
https://en.wikipedia.org/wiki?curid=4765
Basenji
The Basenji is a breed of hunting dog created from stock that originated in Central Africa, including in the Republic of the Congo and other adjacent tropical African countries. The Fédération Cynologique Internationale places the Basenji in the Spitz and "primitive types" categories, while the American Kennel Club classifies it as a hound. The breed does not bark in the traditional manner of most dogs, instead vocalising with an unusual, yodel-like "talking" sound, due to its unusually shaped larynx. This trait earns the Basenji its nickname of "barkless" dog, a similar feature seen and heard in the New Guinea singing dog. Basenjis are athletic small dogs that can run up to , and share many distinctive traits with the pye or pariah dog types of the Indian subcontinent. In addition to their uniquely similar vocalisations, the Basenji, the Australian dingo and the aforementioned New Guinea singing dog all come into estrus only once per year, as does the Tibetan Mastiff; other dog breeds may have two or more breeding seasons each year. Basenjis lack a distinctive odor, or "dog smell". Name. In Swahili, "mbwa shenzi" translates to "savage dog". Another local name is "m'bwa m'kube m'bwa wa mwitu" ("wild dog"), or "dog that jumps up and down", a reference to their tendency to jump straight up to spot their quarry. The dogs are also known to the Azande of South Sudan as "ango angari". Lineage. The Basenji has been identified as a basal breed that predates the emergence of the modern breeds in the 19th century. DNA studies based on whole-genome sequences indicate that the basenji and the dingo are both considered to be basal members of the domestic dog clade. In 2021, the genomes of two basenjis were assembled, which indicated that the basenji falls within the Asian spitz group. The AMY2B gene produces an enzyme, amylase, that helps to digest starch. The wolf, the husky and the dingo possess only two copies of this gene, which provides evidence that they arose before the expansion of agriculture. The genomic study found that, similarly, the basenji possesses only two copies of this gene. History. The Basenji originated on the continent of Africa, where it has been identified with Egyptian depictions of dogs with curled tails and erect ears, a breed called Tesem which is found in murals as old as 4,500 years. Europeans first described the breed that became the Basenji in the Congo in 1895. These dogs were prized by locals for their intelligence, courage, speed, and silence. Several attempts were made to introduce the breed into England, but the earliest imports succumbed to disease. In 1923 six Basenjis were taken from Sudan, but all six died from distemper shots received in quarantine. It was not until the 1930s that foundation stock was successfully established in England, and then in the United States by animal importer Henry Trefflich. It is likely that nearly all the Basenjis in the Western world are descended from these few original imports. The breed was officially accepted into the AKC in 1943. In 1990, the AKC stud book was reopened to 14 new imports at the request of the Basenji Club of America. The stud book was reopened again to selected imported dogs from 1 January 2009 to 31 December 2013. In 2010, an American-led expedition collected breeding stock in villages in the Basankusu area of the Democratic Republic of the Congo. Basenjis are also registered with the United Kennel Club.
The popularity of the Basenji in the United States, according to the American Kennel Club, has declined over the past decade, with the breed ranked 71st in 1999, decreasing to 84th in 2006, and to 93rd in 2011. Characteristics. Appearance. Basenjis are small, short-haired dogs with erect ears, tightly curled tails and graceful necks. A Basenji's forehead is wrinkled, even more so when it is young or extremely excited. A Basenji's eyes are typically almond-shaped. Basenjis typically weigh about and stand at the shoulder. They are a square breed, which means they are as long as they are tall, with males usually larger than females. Basenjis are athletic dogs, and deceptively powerful for their size. The FCI standard states that, when moving, their legs should be 'carried straight forward with a swift, long, tireless, swinging stride.' Basenjis come in a few different colorations: red, black, tricolor, and brindle, and they all have white feet, chests and tail tips. Temperament and behavior. The Basenji is alert, energetic, curious and reserved with strangers. The Basenji tends to become emotionally attached to a single human. Basenjis may not get along with non-canine pets. Basenjis dislike wet weather, much like cats, and will often refuse to go outside in any sort of damp conditions. They like to climb, and can easily scale chain-link fences. Basenjis often stand on their hind legs, somewhat like a meerkat, by themselves or leaning on something; this behavior is often observed when the dog is curious about something. Basenjis have a strong prey drive and will go after cats and other small animals. According to the book "The Intelligence of Dogs", they are the second least trainable dog breed when required to follow human commands (behind only the Afghan Hound). Their real intelligence manifests when they are required to actually solve problems for the sake of their own goals (such as food or freedom). Health. The only completed health survey covering the Basenji was conducted by the UK Kennel Club in 2004. The survey indicated that the most prevalent conditions in Basenjis were dermatitis (9% of responses), incontinence and bladder infection (5%), hypothyroidism (4%), and pyometra and infertility (4%). Longevity. Basenjis in the 2004 UK Kennel Club survey had a median lifespan of 13.6 years (sample size of 46 deceased dogs), which is 1–2 years longer than the median lifespan of other breeds of similar size. The oldest dog in the survey was 17.5 years. The most common causes of death were old age (30%), urologic (incontinence, Fanconi syndrome, chronic kidney failure; 13%), behavior ("unspecified" and aggression; 9%), and cancer (9%). Fanconi syndrome. Fanconi syndrome, an inheritable disorder in which the renal (kidney) tubules fail to reabsorb electrolytes and nutrients, is unusually common in Basenjis. Symptoms include excessive thirst, excessive urination, and glucose in the urine, which may lead to a misdiagnosis of diabetes. Fanconi syndrome usually presents between 4 and 8 years of age, but sometimes as early as 3 years or as late as 10 years. Fanconi syndrome is treatable, and organ damage is reduced if treatment begins early. Basenji owners are advised to test their dog's urine for glucose once a month beginning at the age of 3 years. Glucose testing strips designed for human diabetics are inexpensive and available at most pharmacies.
A Fanconi disease management protocol has been developed that can be used by veterinarians to treat Fanconi-afflicted dogs. Other Basenji health issues. Basenjis sometimes carry a simple recessive gene that, when homozygous for the defect, causes genetic hemolytic anemia (HA). Most 21st-century Basenjis are descended from ancestors that have tested clean. When lineage from a fully tested line (set of ancestors) cannot be completely verified, the dog should be tested before breeding. As this is a non-invasive DNA test, a Basenji can be tested for HA at any time. Basenjis sometimes suffer from hip dysplasia, resulting in loss of mobility and arthritis-like symptoms. All dogs should be tested by either OFA or PennHIP prior to breeding. Malabsorption, or immunoproliferative enteropathy, is an autoimmune intestinal disease that leads to anorexia, chronic diarrhea, and even death. A special diet can improve the quality of life for afflicted dogs. The breed can also fall victim to progressive retinal atrophy (a degeneration of the retina causing blindness) and several less serious hereditary eye problems such as coloboma (a hole in the eye structure) and persistent pupillary membrane (tiny threads across the pupil).
4768
7903804
https://en.wikipedia.org/wiki?curid=4768
Brit milah
The brit milah ("covenant of circumcision"), or bris, is the ceremony of circumcision in Judaism and Samaritanism during which a newborn male's foreskin is surgically removed. According to the Book of Genesis, God commanded the biblical patriarch Abraham to be circumcised: an act to be followed by his descendants on the eighth day of life, symbolizing the covenant between God and the Jewish people. Today, it is generally performed by a mohel on the eighth day after the infant's birth and is followed by a celebratory meal known as a "seudat mitzvah". "Brit milah" is considered among the most important and central commandments in Judaism, and the rite has played a central role in Jewish history and civilization. The Talmud, when discussing the importance of "brit milah", considers it equal to all other "mitzvot" (commandments). Abraham's descendants who voluntarily fail to undergo "brit milah", barring extraordinary circumstances, are believed to suffer "kareth", which, in Jewish theology, is the extinction of the soul and the denial of a share in the World to Come. The "brit" is understood by Jews to signify acceptance into the ongoing covenant between God and the Jewish people, which is why "gerim" (converts) must undergo a form of "brit" to finalize conversion. Historical conflicts between Jews and European civilizations have occurred several times over "brit milah", including multiple campaigns of Jewish ethnic, cultural, and religious persecution, with subsequent bans and restrictions on the practice as an attempted means of forceful assimilation, conversion, and ethnocide, most famously in the Maccabean Revolt against the Seleucid Empire. "In Jewish history, the banning of circumcision ("brit milah") has historically been a first step toward more extreme and violent forms of persecution." These periods have generally been linked to suppression of Jewish religious, ethnic, and cultural identity and subsequent "punishment at the hands of government authorities for engaging in circumcision". The Maccabee victory in the Maccabean Revolt—ending the prohibition against circumcision—is celebrated in Hanukkah. Circumcision is near-universal among Jews. "Brit milah" also has immense importance in other religions. The Gospel of Luke records that Mary and Joseph, the parents of Jesus, had him undergo circumcision. Origins (unknown to 515 BCE). The origin of circumcision is not known with certainty; however, artistic and literary evidence from ancient Egypt suggests it was practiced in the ancient Near East from at least the Sixth Dynasty (–2181 BCE). According to some scholars, it emerged as a sign of the covenant only during the Babylonian Exile. Scholars who posit the existence of a hypothetical J source (likely composed during the seventh century BCE) of the Pentateuch in Genesis 15 hold that it would not have mentioned a covenant that involves the practice of circumcision. Only in the P source (likely composed during the sixth century BCE) of Genesis 17 does the notion of circumcision become linked to a covenant. Some scholars have argued that it originated as a replacement for child sacrifice. Biblical references. According to the Hebrew Bible, Adonai commanded the biblical patriarch Abraham to be circumcised, an act to be followed by his descendants. Leviticus 12:3 says: "And in the eighth day the flesh of his foreskin shall be circumcised." According to the Hebrew Bible, it was "a reproach" for an Israelite to be uncircumcised.
The plural term "arelim" ("uncircumcised") is used opprobriously, denoting the Philistines and other non-Israelites, and is used in conjunction with "tamei" (unpure) for heathen. The word "arel" ("uncircumcised" [singular]) is also employed for "impermeable"; it is also applied to the first three years' fruit of a tree, which is forbidden. However, the Israelites born in the wilderness after the Exodus from Egypt were not circumcised. Joshua 5:2–9 explains, "all the people that came out" of Egypt were circumcised, but those "born in the wilderness" were not. Therefore, Joshua, before the celebration of the Passover, had them circumcised at Gilgal specifically before they entered Canaan. Abraham, too, was circumcised when he moved into Canaan. The prophetic tradition emphasizes that God expects people to be good as well as pious, and that non-Jews will be judged based on their ethical behavior; see Noahide Law. Thus, Jeremiah 9:25–26 says that circumcised and uncircumcised will be punished alike by the Lord; for "all the nations are uncircumcised, and all the house of Israel are uncircumcised in heart". The penalty of willful non-observance is "kareth" (making oneself liable to extirpation or excommunication), as noted in Genesis 17:1–14. Conversion to Judaism for non-Israelites in Biblical times necessitated circumcision; otherwise one could not partake in the Passover offering. Today, as in the time of Abraham, it is required of converts in Orthodox, Conservative and Reform Judaism. As found in Genesis 17:1–14, "brit milah" is considered to be so important that should the eighth day fall on the Sabbath, actions that would normally be forbidden because of the sanctity of the day are permitted in order to fulfill the requirement to circumcise. The Talmud, when discussing the importance of milah, compares it to being equal to all other mitzvot (commandments), based on the gematria for "brit" of 612 (bet = 2, resh = 200, yod = 10, tav = 400; adding the commandment of milah itself yields 613, the traditional total number of commandments). Covenants in biblical times were often sealed by severing an animal, with the implication that the party who breaks the covenant will suffer a similar fate. In Hebrew, the verb meaning "to seal a covenant" translates literally as "to cut". It is presumed by Jewish scholars that the removal of the foreskin symbolically represents such a sealing of the covenant. Speculated reasons for biblical circumcision include marking "patrilineal descent, sexual fertility, male initiation, cleansing of birth impurity, and dedication to God". Ceremony. "Mohalim". "Mohalim" are Jews trained in the practice of "brit milah", the "covenant of circumcision". According to traditional Jewish law, in the absence of a grown free Jewish male expert, anyone who has the required skills is also authorized to perform the circumcision, if they are Jewish. Yet most streams of non-Orthodox Judaism allow women to be "mohalot" (plural of "mohelet", the feminine of "mohel"), without restriction. In 1984, Deborah Cohen became the first certified Reform "mohelet"; she was certified by the "brit milah" program of Reform Judaism. Time and place. It is customary for the "brit" to be held in a synagogue, but it can also be held at home or any other suitable location. The "brit" is performed on the eighth day from the baby's birth, taking into consideration that according to the Jewish calendar, the day begins at the sunset of the day before. If the baby is born on Sunday before sunset, the "brit" will be held the following Sunday. However, if the baby is born on Sunday night after sunset, the "brit" is on the following Monday.
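The timing rule above amounts to a small calendar computation: find the civil date on which the Jewish day of birth began, then count that day as day one of eight. The following is a minimal sketch in Python illustrating only this rule; the function name is ours, the sunset hour is a fixed placeholder rather than the actual local sunset, and the Shabbat, holiday, and Caesarean-section qualifications discussed next are deliberately omitted.
    from datetime import date, datetime, time, timedelta

    def brit_date(birth: datetime, sunset: time = time(18, 0)) -> date:
        # Sketch of the eighth-day rule, assuming a fixed sunset hour.
        jewish_day = birth.date()
        if birth.time() >= sunset:
            # After sunset the Jewish day has already advanced, so the
            # birth counts as occurring on the next civil date.
            jewish_day += timedelta(days=1)
        # The (Jewish) day of birth counts as day one, so the brit falls
        # seven civil days later, on the same weekday.
        return jewish_day + timedelta(days=7)

    # Born Sunday 15:00, before sunset: brit the following Sunday.
    print(brit_date(datetime(2023, 7, 2, 15, 0)))  # -> 2023-07-09 (Sunday)
    # Born Sunday 20:00, after sunset: counts as Monday, brit the next Monday.
    print(brit_date(datetime(2023, 7, 2, 20, 0)))  # -> 2023-07-10 (Monday)
Because the birth day itself counts as day one, the brit always falls on the same day of the week as the (Jewish) day of birth, which is why the examples in the preceding paragraph land on a Sunday and a Monday respectively.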
The "brit" takes place on the eighth day following birth even if that day is Shabbat or a holiday; however, if the baby is born on Friday night between sunset and nightfall, the "briti" is delayed until the following Sunday. Furthermore, the brit is performed on the Sabbath only if it is a natural birth; if the birth is a Caesarean section, the "brit" is delayed until the Sunday. A brit is traditionally performed in the morning, but it may be performed any time during daylight hours. Postponement for health reasons. The Talmud explicitly instructs that a boy must not be circumcised if he had two brothers who died due to complications arising from their circumcisions, and Maimonides says that this excluded paternal half-brothers. This may be due to a concern about hemophilia. An Israeli study found a high rate of urinary tract infections if the bandage is left on too long. If the child is born prematurely or has other serious medical problems, the "brit milah" will be postponed until the doctors and "mohel" deem the child strong enough for his foreskin to be surgically removed. Adult circumcision. In recent years, the circumcision of adult Jews who were not circumcised as infants has become more common than previously thought. In such cases, the "brit milah" will be done at the earliest date that can be arranged. The actual circumcision will be private, and other elements of the ceremony (e.g., the celebratory meal) may be modified to accommodate the desires of the one being circumcised. Circumcision for the dead. According to Halacha, a baby who dies before they had time to circumcise him must be circumcised before burial. Several reasons were given for this commandment. Some have written that there is no need for this circumcision. Anesthetic. Most prominent "acharonim" rule that the "mitzvah" of "brit milah" lies in the pain it causes, and anesthetic, sedation, or ointment should generally not be used. However, it is traditionally common to feed the infant a drop of wine or other sweet liquid to soothe him. Eliezer Waldenberg, Yechiel Yaakov Weinberg, Shmuel Wosner, Moshe Feinstein and others agree that the child should not be sedated, although pain relieving ointment may be used under certain conditions; Shmuel Wosner particularly asserts that the act ought to be painful, per Psalm 44:23. In a letter to the editor published in "The New York Times" on January 3, 1998, Rabbi Moshe David Tendler disagrees with the above and writes, "It is a biblical prohibition to cause anyone unnecessary pain." Rabbi Tendler recommends the use of an analgesic cream. However, lidocaine should not be used because it has been linked to several pediatric near-death episodes. "Kvater". The title of "kvater" () among Ashkenazi Jews is for the person who carries the baby from the mother to the father, who in turn carries him to the "mohel." This honor is usually given to a couple without children, as a merit or "segula" (efficacious remedy) that they should have children of their own. The origin of the term is Middle High German "gevater/gevatere" ("godfather"). "Seudat mitzvah". After the ceremony, a celebratory meal takes place. At the "birkat hamazon", according to the Eastern Asheknazic rite, additional introductory lines, known as "Nodeh Leshimcha", are added. These lines praise God and request the permission of God, the Torah, Kohanim and distinguished people present to proceed with the grace. When the four main blessings are concluded, special "ha-Rachaman" prayers are recited. 
They request various blessings from God. According to the Western Ashkenazic rite, "Nodeh Leshimcha" is not recited. "Elohim tzivita li-yedidcha bechiracha" is recited during the second blessing, and a set of "ha-Rachaman" prayers, different from the ones in the Eastern Ashkenazic rite, are recited. Ritual components. Uncovering, "priah". At the neonatal stage, the inner preputial epithelium is still linked with the surface of the glans. The "mitzvah" is executed only when this epithelium is either removed or permanently peeled back to uncover the glans. In medical circumcisions performed by surgeons, the epithelium is removed along with the foreskin, to prevent post-operative penile adhesion and its complications. However, in ritual circumcisions performed by a "mohel", the epithelium is most commonly peeled off only after the foreskin has been amputated. This procedure is called "priah", which means 'uncovering'. The main goal of "priah" (also known as "bris periah") is to remove as much of the inner layer of the foreskin as possible and prevent the movement of the shaft skin, which creates the look and function of what is known as a "low and tight" circumcision. According to Rabbinic interpretation of traditional Jewish sources, the "priah" has been performed as part of the Jewish circumcision since the Israelites first inhabited the Land of Israel. The "Oxford Dictionary of the Jewish Religion" states that many Hellenistic Jews attempted to restore their foreskins, and that similar action was taken during the Hadrianic persecution, a period in which a prohibition against circumcision was issued. The writers of the dictionary hypothesize that the more severe method practiced today was probably begun in order to prevent the possibility of restoring the foreskin after circumcision, and therefore the rabbis added the requirement of cutting the foreskin in periah. According to Shaye J. D. Cohen, the Torah only commands milah. David Gollaher has written that the rabbis added the procedure of priah to discourage men from trying to restore their foreskins: "Once established, priah was deemed essential to circumcision; if the mohel failed to cut away enough tissue, the operation was deemed insufficient to comply with God's covenant", and "Depending on the strictness of individual rabbis, boys (or men thought to have been inadequately cut) were subjected to additional operations." "Metzitzah" (the alternate spellings "metzizah" and "metsitsah" are also used). In the metzitzah, the guard is slid over the foreskin as close to the glans as possible to allow for maximum removal of the former without any injury to the latter. A scalpel is used to detach the foreskin, and a tube may be used for the "metzitzah" itself. In addition to "milah" (the initial cut amputating the akroposthion) and "priah", mentioned above, the Talmud (Mishnah Shabbat 19:2) mentions a third step, "metzitzah", translated as suction, as one of the steps involved in the circumcision rite. The Talmud writes that a "Mohel (Circumciser) who does not suck creates a danger, and should be dismissed from practice". Rashi, on that Talmudic passage, explains that this step is in order to draw some blood from deep inside the wound to prevent danger to the baby. The kabbalists Rabbi Shalom Sharabi and Rabbi Isaac Luria wrote that one who performs "metzitzah" ought to consciously endeavor to draw away the 'evil inclination' that lies within the blood that is extracted.
There are other modern antiseptic and antibiotic techniques—all used as part of the "brit milah" today—which many say accomplish the intended purpose of "metzitzah"; however, since "metzitzah" is one of the four steps to fulfill the mitzvah, it continues to be practiced by Orthodox Jews, while traditional Karaites and Beta Israel never practiced it, as they never incorporated the Talmud. "Metzitzah b'peh" (oral suction). The traditional method of performing "metzitzah b'peh" (abbreviated as MBP)—or oral suction—has become controversial. The process has the "mohel" place his mouth directly on the infant's genital wound to suck blood away from the cut. Many circumcision ceremonies no longer use metzitzah b'peh, but Haredi Jews continue to perform it. The practice poses a serious risk of spreading herpes to the infant. Proponents maintain that there is no conclusive evidence that links herpes to "metzitzah", and that attempts to limit this practice infringe on religious freedom. The practice has become a subject of controversy in both secular and Jewish medical ethics. The ritual of "metzitzah" is found in Mishnah Shabbat 19:2, which lists it as one of the four steps involved in the circumcision rite. Rabbi Moses Sofer, also known as the Chatam Sofer (1762–1839), observed that the Talmud states that the rationale for this part of the ritual was hygienic—i.e., to protect the health of the child. As such, the Chatam Sofer issued a ruling to perform "metzitzah" with a sponge instead of oral suction in order to safeguard the child from potential risks. He also cited a passage in Nedarim 32a as a warrant for the position that metzitzah b'peh was not an obligatory part of the circumcision ceremony. The case in question involved a mohel who was suspected of transmitting herpes via metzitzah to infants; he was checked several times and never found to have signs of the disease, and a ban was requested because of the "possibility of future infections". Moshe Schick (1807–1879), a student of Moses Sofer, states in his book of responsa, "She'eilos u'teshuvos Maharam Schick" (Orach Chaim 152), that Moses Sofer gave the ruling in that specific instance only because the mohel refused to step down and had secular government connections that prevented his removal in favor of another mohel, and the heter may not be applied elsewhere. He also states ("Yoreh Deah" 244) that the practice is possibly a Sinaitic tradition, i.e., Halacha l'Moshe m'Sinai. Other sources contradict this claim, with copies of Moses Sofer's responsa making no mention of the legal case or of his ruling applying in only one situation. Rather, that responsum makes quite clear that "metzitzah" was a health measure and should never be employed where there is a health risk to the infant. Chaim Hezekiah Medini, after corresponding with the greatest Jewish sages of the generation, concluded the practice to be Halacha l'Moshe m'Sinai and elaborated on what prompted Moses Sofer to give the above ruling. He tells the story that a student of Moses Sofer, Lazar Horowitz, Chief Rabbi of Vienna at the time and author of the responsa "Yad Elazer", needed the ruling because of a governmental attempt to ban circumcision completely if it included "metzitzah b'peh". He therefore asked Sofer to give him permission to do "brit milah" without "metzitzah b'peh". When he presented the defense in secular court, his testimony was erroneously recorded to mean that Sofer stated it as a general ruling.
The Rabbinical Council of America (RCA), which claims to be the largest American organization of Orthodox rabbis, published an article by mohel Yehudi Pesach Shields in the summer 1972 issue of its magazine "Tradition", calling for the abandonment of metzitzah b'peh. Since then the RCA has issued an opinion that advocates methods that do not involve contact between the mohel's mouth and the infant's genitals, such as the use of a sterile syringe, thereby eliminating the risk of infection. According to the Chief Rabbinate of Israel and the Edah HaChareidis, "metzitzah b'peh" should still be performed. The practice of "metzitzah b'peh" posed a serious risk in the transfer of herpes from mohelim to eight Israeli infants, one of whom suffered brain damage. When three New York City infants contracted herpes after "metzitzah b'peh" by one "mohel" and one of them died, New York authorities took out a restraining order against the "mohel" requiring use of a sterile glass tube, or pipette. The mohel's attorney argued that the New York Department of Health had not supplied conclusive medical evidence linking his client with the disease. In September 2005, the city withdrew the restraining order and turned the matter over to a rabbinical court. Thomas Frieden, the Health Commissioner of New York City, wrote, "There exists no reasonable doubt that 'metzitzah b'peh' can and has caused neonatal herpes infection...The Health Department recommends that infants being circumcised not undergo metzitzah b'peh." In May 2006, the Department of Health for New York State issued a protocol for the performance of metzitzah b'peh. Antonia Novello, Commissioner of Health for New York State, together with a board of rabbis and doctors, worked, she said, to "allow the practice of metzizah b'peh to continue while still meeting the Department of Health's responsibility to protect the public health". In 2012, a two-week-old baby in New York City died of herpes because of metzitzah b'peh. In three medical papers published in Israel, Canada, and the US, oral suction following circumcision was suggested as a cause in 11 cases of neonatal herpes. Researchers noted that prior to 1997, neonatal herpes reports in Israel were rare, and that the later instances were correlated with mothers who did not themselves carry the virus. Rabbi Doctor Mordechai Halperin implicates the "better hygiene and living conditions that prevail among the younger generation", which lowered to 60% the rate of young Israeli Haredi mothers who carry the virus. He explains that an "absence of antibodies in the mothers' blood means that their newborn sons received no such antibodies through the placenta, and therefore are vulnerable to infection by HSV-1". Barriers. Because of the risk of infection, some rabbinical authorities have ruled that the traditional practice of direct contact should be replaced by using a sterile tube between the wound and the mohel's mouth, so there is no direct oral contact. The Rabbinical Council of America, the largest group of Modern Orthodox rabbis, endorses this method. The RCA paper states: "Rabbi Schachter even reports that Rav Yosef Dov Soloveitchik reports that his father, Rav Moshe Soloveitchik, would not permit a mohel to perform metzitza be'peh with direct oral contact, and that his grandfather, Rav Chaim Soloveitchik, instructed mohelim in Brisk not to do metzitza be'peh with direct oral contact.
However, although Rav Yosef Dov Soloveitchik also generally prohibited metzitza be'peh with direct oral contact, he did not ban it by those who insisted upon it." Rabbi Sinai Schiffer of Baden, Germany, states in his sefer "Mitzvas Hametzitzah" that he is in possession of letters from 36 major Russian (Lithuanian) rabbis that categorically prohibit metzitzah with a sponge and require it to be done orally. Among them is Rabbi Chaim Halevi Soloveitchik of Brisk. In September 2012, the New York Department of Health unanimously ruled that the practice of metzitzah b'peh should require informed consent from the parent or guardian of the child undergoing the ritual. Prior to the ruling, several hundred rabbis, including Rabbi David Niederman, the executive director of the United Jewish Organization of Williamsburg, signed a declaration stating that they would not inform parents of the potential dangers that came with metzitzah b'peh, even if informed consent became law. In a motion for a preliminary injunction with intent to sue, filed against the New York City Department of Health & Mental Hygiene, affidavits by Awi Federgruen, Brenda Breuer, and Daniel S. Berman argued that the study on which the department based its conclusions is flawed. The "informed consent" regulation was challenged in court. In January 2013, the U.S. District Court ruled that the law did not specifically target religion and therefore did not need to pass strict scrutiny. The ruling was appealed to the Court of Appeals. On August 15, 2014, the Second Circuit Court of Appeals reversed the decision by the lower court, and ruled that the regulation does have to be reviewed under strict scrutiny to determine whether it infringes on Orthodox Jews' freedom of religion. On September 9, 2015, after coming to an agreement with the community, the New York City Board of Health voted to repeal the informed consent regulation. "Hatafat dam brit". A "brit milah" is more than circumcision; it is a sacred ritual in Judaism, as distinguished from its non-ritual requirement in Islam. One ramification is that the "brit" is not considered complete unless a drop of blood is actually drawn. The standard medical methods of circumcision through constriction do not meet the requirements of the "halakhah" for "brit milah", because they are done with hemostasis, "i.e.", they stop the flow of blood. Moreover, circumcision alone, in the absence of the "brit milah" ceremony, does not fulfill the requirements of the "mitzvah". Therefore, in cases involving a Jew who was circumcised outside of a brit milah, an already-circumcised convert, or an aposthetic (born without a foreskin) individual, the mohel draws a symbolic drop of blood ("hatafat dam brit") from the penis at the point where the foreskin would have been or was attached. "Milah L'shem Giur". A "milah l'shem giur" is a "circumcision for the purpose of conversion". In Orthodox Judaism, this procedure is usually done by adoptive parents for adopted boys who are being converted as part of the adoption, or by families with young children converting together. It is also required for adult converts who were not previously circumcised, e.g., those born in countries where circumcision at birth is not common. The conversion of a minor is valid in both Orthodox and Conservative Judaism until the child reaches the age of majority (13 for a boy, 12 for a girl); at that time the child has the option of renouncing his conversion and Judaism, and the conversion will then be considered retroactively invalid.
He must be informed of his right to renounce his conversion if he wishes. If he does not make such a statement, it is accepted that the boy is halakhically Jewish. Orthodox rabbis will generally not convert a non-Jewish child raised by a mother who has not converted to Judaism. The laws of conversion and conversion-related circumcision in Orthodox Judaism have numerous complications, and authorities recommend that a rabbi be consulted well in advance. In Conservative Judaism, the milah l'shem giur procedure is also performed for a boy whose mother has not converted, but with the intention that the child be raised Jewish. This conversion of a child to Judaism without the conversion of the mother is allowed by Conservative interpretations of halakha. Conservative rabbis will authorize it only under the condition that the child be raised as a Jew in a single-faith household. Should the mother convert, and if the boy has not yet reached his third birthday, the child may be immersed in the mikveh with the mother, after the mother has already immersed, to become Jewish. If the mother does not convert, the child may be immersed in a mikveh, or body of natural waters, to complete the child's conversion to Judaism. This can be done before the child is even one year old. If the child did not immerse in the mikveh, or the boy was too old, then the child may choose of his own accord to become Jewish at age 13 as a Bar Mitzvah, and complete the conversion then. Where the procedure was performed but not followed by immersion or the other requirements of the conversion procedure (e.g., in Conservative Judaism, where the mother has not converted), if the boy chooses to complete the conversion at Bar Mitzvah, a "milah l'shem giur" performed when the boy was an infant removes the obligation to undergo either a full brit milah or "hatafat dam brit". Visible symbol of a covenant. Rabbi Saadia Gaon considers something to be "complete" if it lacks nothing, but also has nothing that is unneeded. He regards the foreskin as an unneeded organ that God created in man, and so by amputating it, the man is completed. The author of Sefer ha-Chinuch provides three reasons for the practice of circumcision. Talmud professor Daniel Boyarin offered two explanations for circumcision. One is that it is a literal inscription on the Jewish body of the name of God in the form of the letter "yud" (from "yesod"). The second is that the act of bleeding represents a feminization of Jewish men, significant in the sense that the covenant represents a marriage between Jews and (a symbolically male) God. Other reasons. In "Of the Special Laws, Book 1", the Jewish philosopher Philo additionally gave other reasons for the practice of circumcision, attributing four of them to "men of divine spirit and wisdom". Judaism, Christianity, and the Early Church (4 BCE – 150 CE). The 1st-century Jewish author Philo Judaeus defended Jewish circumcision on several grounds. He thought that circumcision should be done as early as possible, as it would not be as likely to be done by someone's own free will. He claimed that the foreskin prevented semen from reaching the vagina, and so circumcision should be performed as a way to increase the nation's population. He also noted that circumcision should be performed as an effective means to reduce sexual pleasure. There was also division in Pharisaic Judaism between Hillel the Elder and Shammai on the issue of circumcision of proselytes.
According to the Gospel of Luke, Jesus was circumcised on the 8th day. According to saying 53 of the Gospel of Thomas, The foreskin was considered a symbol of beauty, civility, and masculinity throughout the Greco-Roman world; it was customary to spend an hour or so a day exercising nude in the "gymnasium" and in Roman baths, where matters of business and politics were discussed, and many Jewish men did not want to be seen in public deprived of their foreskins. To expose one's glans in public was seen as indecent, vulgar, and a sign of sexual arousal and desire. Classical, Hellenistic, and Roman culture widely found circumcision to be barbaric, cruel, and utterly repulsive. By the period of the Maccabees, many Jewish men attempted to hide their circumcisions through the process of epispasm, owing to the circumstances of the period, although Jewish religious writers denounced these practices as abrogating the covenant of Abraham in 1 Maccabees and the Talmud. After Christianity and Second Temple Judaism split apart from one another, "milah" was declared spiritually unnecessary as a condition of justification by Christian writers such as Paul the Apostle and subsequently in the Council of Jerusalem, while it further increased in importance for Jews. In the mid-2nd century CE, the Tannaim, the successors of the newly ideologically dominant Pharisees, introduced and made mandatory a secondary step of circumcision known as the "periah"; without it, circumcision was declared to have no spiritual value. This new form removed as much of the inner mucosa as possible, along with the frenulum and its corresponding delta, and prevented the movement of shaft skin, creating a "low and tight" circumcision. It was intended to make it impossible to restore the foreskin. This is the form practiced among the large majority of Jews today and later became a basis for the routine neonatal circumcisions performed in the United States. The steps, justifications, and imposition of the practice have varied dramatically throughout history; commonly cited reasons for the practice have included it being a way to control male sexuality by reducing sexual pleasure and desire, a visual marker of the covenant of the pieces, a metaphor for mankind perfecting creation, and a means to promote fertility. The original version in Judaic history was either a ritual nick or cut done by a father to the "acroposthion", the part of the foreskin that overhangs the glans penis. This form of genital nicking or cutting, known simply as "milah", became adopted among Jews by the Second Temple period and was the predominant form until the second century CE. The notion of "milah" being linked to a biblical covenant is generally believed to have originated in the 6th century BCE as a product of the Babylonian captivity; the practice likely lacked this significance among Jews before that period. Reform Judaism. The Reform societies established in Frankfurt and Berlin regarded circumcision as barbaric and wished to abolish it. However, while prominent rabbis such as Abraham Geiger believed the ritual to be barbaric and outdated, they refrained from instituting any change in this matter. In 1843, when a father in Frankfurt refused to circumcise his son, rabbis of all shades in Germany stated it was mandated by Jewish law; even Samuel Holdheim affirmed this. 
By 1871, Reform rabbinic leadership in Germany reasserted "the supreme importance of circumcision in Judaism", while affirming the traditional viewpoint that non-circumcised Jews are Jews nonetheless. Although the issue of circumcision of converts continues to be debated, the necessity of "britot milah" for Jewish infant boys has been stressed in every subsequent Reform rabbi's manual or guide. While the Reform movement does not require the circumcision of adult male converts, the practice is increasingly acknowledged and carried out by many Reform communities as an important part of the conversion process. Since 1984, Reform Judaism has trained and certified over 300 of its own practicing "mohalim" in this ritual. By 2001, the Central Conference of American Rabbis had begun to recommend that male converts who are already circumcised undergo "hatafat dam brit". In Samaritanism. Samaritan "brit milah" occurs on the eighth day following the child's birth at the father's home. In addition to special prayers and readings from the Torah pertaining to the ritual, an old hymn that invokes blessings for parents and children is sung. According to the British explorer Claude Reignier Conder, in their circumcision hymn the Samaritans express their gratitude to a certain Roman soldier named Germon, sent by an unnamed Roman emperor as a sentinel over the home of the Samaritan High Priest, for his kindness in allowing the circumcision to take place. They tried to give him money, but he refused, asking only to be included in their future prayers instead.
4770
7903804
https://en.wikipedia.org/wiki?curid=4770
Business ethics
Business ethics (also known as corporate ethics) is a form of applied ethics or professional ethics that examines ethical principles and moral or ethical problems that can arise in a business environment. It applies to all aspects of business conduct and is relevant to the conduct of individuals and entire organizations. These ethics originate from individuals, organizational statements, or the legal system. Such norms, values, and ethical and unethical practices are the principles that guide a business. Business ethics refers to contemporary organizational standards, principles, sets of values, and norms that govern the actions and behavior of an individual in the business organization. Business ethics has two dimensions: normative business ethics and descriptive business ethics. As a corporate practice and a career specialization, the field is primarily normative. Academics attempting to understand business behavior employ descriptive methods. The range and quantity of business ethical issues reflect the interaction of profit-maximizing behavior with non-economic concerns. Interest in business ethics accelerated dramatically during the 1980s and 1990s, both within major corporations and within academia. For example, most major corporations today promote their commitment to non-economic values under headings such as ethics codes and social responsibility charters. Adam Smith said in 1776, "People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices." Governments use laws and regulations to point business behavior in what they perceive to be beneficial directions. Ethics implicitly regulates areas and details of behavior that lie beyond governmental control. The emergence of large corporations with limited relationships with, and sensitivity to, the communities in which they operate accelerated the development of formal ethics regimes. Maintaining an ethical status is the responsibility of the manager of the business. According to a 1990 article in the "Journal of Business Ethics", "Managing ethical behavior is one of the most pervasive and complex problems facing business organizations today." History. Business ethics reflects the norms of each historical period. As time passes, norms evolve, causing accepted behaviors to become objectionable. Business ethics and the resulting behavior evolved as well. Business was involved in slavery, colonialism, and the Cold War. The term 'business ethics' came into common use in the United States in the early 1970s. By the mid-1980s at least 500 courses in business ethics reached 40,000 students, using some twenty textbooks and at least ten casebooks, supported by professional societies, centers, and journals of business ethics. The Society for Business Ethics was founded in 1980. European business schools adopted business ethics after 1987, commencing with the European Business Ethics Network. In 1982 the first single-authored books in the field appeared. Firms began highlighting their ethical stature in the late 1980s and early 1990s, possibly in an attempt to distance themselves from the business scandals of the day, such as the savings and loan crisis. The concept of business ethics caught the attention of academics, media, and business firms by the end of the Cold War. However, criticism of business practices was attacked as infringing on the freedom of entrepreneurs, and critics were accused of supporting communists. 
This scuttled the discourse of business ethics both in the media and in academia. The Defense Industry Initiative on Business Ethics and Conduct (DII) was created to support corporate ethical conduct. This era saw the rise of belief in, and support for, self-regulation and free trade, which lifted tariffs and barriers and allowed businesses to merge and divest in an increasingly global atmosphere. Religious and philosophical origins. One of the earliest written treatments of business ethics is found in the "Tirukkuṛaḷ", a Tamil book dated variously from 300 BCE to the 7th century CE and attributed to Thiruvalluvar. Many verses discuss business ethics: verse 113 in particular, adapting to a changing environment in verses 474, 426, and 140, and learning the intricacies of different tasks in verses 462 and 677. Overview. Business ethics reflects the philosophy of business, one aim of which is to determine the fundamental purposes of a company. Business purpose expresses the company's reason for existing. Modern discussion on the purpose of business has been freshened by views from thinkers such as Richard R. Ellesworth, Peter Drucker, and Nikos Mourkogiannis. Earlier views such as Milton Friedman's held that the purpose of a business organization is to make a profit for shareholders. Nevertheless, the purpose of maximizing shareholders' wealth often "fails to energize employees". In practice, many non-shareholders also benefit from a firm's economic activity, among them employees, through contractual compensation and its broader impact; consumers, through the tangible or non-tangible value derived from their purchase choices; and society as a whole, through taxation and/or the company's involvement in social action when it occurs. On the other hand, if a company's purpose is to maximize shareholder returns, then sacrificing profits for other concerns is a violation of its fiduciary responsibility. Corporate entities are legal persons, but this does not mean they are legally entitled to all of the rights, or subject to all of the liabilities, of natural persons. Ethics are the rules or standards that govern our decisions on a daily basis. Many equate "ethics" with conscience or a simplistic sense of "right" and "wrong". Others would say that ethics is an internal code that governs an individual's conduct, ingrained into each person by family, faith, tradition, community, laws, and personal mores. Corporations and professional organizations, particularly licensing boards, generally will have a written code of ethics that governs standards of professional conduct expected of all in the field. It is important to note that "law" and "ethics" are not synonymous, nor are the "legal" and "ethical" courses of action in a given situation necessarily the same. Statutes and regulations passed by legislative bodies and administrative boards set forth the "law". Slavery once was legal in the US, but one certainly would not say enslaving another was an "ethical" act. Economist Milton Friedman wrote that corporate executives' "responsibility ... generally will be to make as much money as possible while conforming to their basic rules of the society, both those embodied in law and those embodied in ethical custom". Friedman also said, "the only entities who can have responsibilities are individuals ... A business cannot have responsibilities. So the question is, do corporate executives, provided they stay within the law, have responsibilities in their business activities other than to make as much money for their stockholders as possible? 
And my answer to that is, no, they do not." This view is known as the Friedman doctrine. A multi-country 2011 survey found support for this view among the "informed public" ranging from 30 to 80%. Ronald Duska and Jacques Cory have described Friedman's argument as consequentialist or utilitarian rather than pragmatic: Friedman's argument implies that unrestrained corporate freedom would benefit the most people in the long term. Duska argued that Friedman failed to differentiate two very different aspects of business: (1) the "motive" of individuals, who are generally motivated by profit to participate in business, and (2) the socially sanctioned "purpose" of business, or the reason why people allow businesses to exist, which is to provide goods and services to people. So Friedman was wrong that making a profit is the only concern of business, Duska argued. Peter Drucker once said, "There is neither a separate ethics of business nor is one needed", implying that standards of personal ethics cover all business situations. However, Drucker in another instance said that the ultimate responsibility of company directors is not to harm—"primum non nocere". Philosopher and author Ayn Rand put forth her idea of rational egoism, which also applies to business ethics. She stressed that the entrepreneur is responsible for his own happiness, that the business is a means to that happiness, and that the entrepreneur is not required to serve the interests of anyone else; no one is entitled to his or her work. Another view of business is that it must exhibit corporate social responsibility (CSR): an umbrella term indicating that an ethical business must act as a responsible citizen of the communities in which it operates, even at the cost of profits or other goals. In the US and most other nations, corporate entities are legally treated as persons in some respects. For example, they can hold title to property, sue and be sued, and are subject to taxation, although their free speech rights are limited. This can be interpreted to imply that they have independent ethical responsibilities. Duska argued that stakeholders expect a business to be ethical and that violating that expectation is counterproductive for the business. Ethical issues include the rights and duties between a company and its employees, suppliers, customers, and neighbors, and its fiduciary responsibility to its shareholders. Issues concerning relations between different companies include hostile takeovers and industrial espionage. Related issues include corporate governance; corporate social entrepreneurship; political contributions; legal issues such as the ethical debate over introducing a crime of corporate manslaughter; and the marketing of corporations' ethics policies. According to research published by the Institute of Business Ethics and Ipsos MORI in late 2012, the three major areas of public concern regarding business ethics in Britain are executive pay, corporate tax avoidance, and bribery and corruption. The ethical standards of an entire organization can be damaged if a corporate psychopath is in charge. This will affect not only the company and its outcome but also the employees who work under such a person. A corporate psychopath can rise in a company through manipulation, scheming, and bullying, done in a way that hides their true character and intentions. Functional business areas. Finance. Fundamentally, finance is a social science discipline. 
The discipline borders behavioral economics, sociology, economics, accounting, and management. It concerns technical issues such as the mix of debt and equity, dividend policy, the evaluation of alternative investment projects, options, futures, swaps and other derivatives, portfolio diversification, and many others. Finance is often mistakenly regarded as a discipline free from ethical burdens. The 2008 financial crisis caused critics to challenge the ethics of the executives in charge of U.S. and European financial institutions and financial regulatory bodies. Finance ethics is overlooked for another reason—issues in finance are often addressed as matters of law rather than ethics. Finance paradigm. Aristotle said, "the end and purpose of the polis is the good life". Adam Smith characterized the good life in terms of material goods and intellectual and moral excellences of character. Smith in his "The Wealth of Nations" commented, "All for ourselves, and nothing for other people, seems, in every age of the world, to have been the vile maxim of the masters of mankind." However, a section of economists influenced by the ideology of neoliberalism interpreted the objective of economics to be the maximization of economic growth through accelerated consumption and production of goods and services. Neoliberal ideology promoted finance from its position as a component of economics to its core. Proponents of the ideology hold that unrestricted financial flows, if redeemed from the shackles of "financial repression", best help impoverished nations to grow. The theory holds that open financial systems accelerate economic growth by encouraging foreign capital inflows, thereby enabling higher levels of savings, investment, employment, productivity, and "welfare", along with containing corruption. Neoliberals recommended that governments open their financial systems to the global market with minimal regulation over capital flows. The recommendations, however, met with criticism from various schools of ethical philosophy. Some pragmatic ethicists found these claims to be unfalsifiable and a priori, although neither of these makes the recommendations false or unethical per se. Raising economic growth to the highest value necessarily means that welfare is subordinate, although advocates dispute this, saying that economic growth provides more welfare than known alternatives. Since history shows that neither regulated nor unregulated firms always behave ethically, neither regime offers an ethical panacea. Neoliberal recommendations to developing countries to unconditionally open up their economies to transnational finance corporations were fiercely contested by some ethicists. The claim that deregulation and the opening up of economies would reduce corruption was also contested. Dobson observes, "a rational agent is simply one who pursues personal material advantage ad infinitum. In essence, to be rational in finance is to be individualistic, materialistic, and competitive. Business is a game played by individuals, as with all games the object is to win, and winning is measured in terms solely of material wealth. Within the discipline, this rationality concept is never questioned, and has indeed become the theory-of-the-firm's sine qua non". Financial ethics is, in this view, a mathematical function of shareholder wealth. Such simplifying assumptions were once necessary for the construction of mathematically robust models. However, signalling theory and agency theory extended the paradigm to greater realism. 
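To make the paradigm concrete: the "evaluation of alternative investment projects" mentioned above is, on this view, reduced to a net-present-value (NPV) comparison, a pure function of expected shareholder wealth. The following minimal Python sketch is illustrative only; the cash flows and the 8% discount rate are invented for the example, not drawn from any source discussed here:

def npv(rate, cash_flows):
    # Discount a series of yearly cash flows (year 0 first) at the given rate.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical projects: an initial outlay followed by yearly returns.
project_a = [-1000.0, 400.0, 400.0, 400.0]  # steady returns
project_b = [-1000.0, 100.0, 300.0, 900.0]  # back-loaded returns

for name, flows in [("A", project_a), ("B", project_b)]:
    print(f"Project {name}: NPV = {npv(0.08, flows):.2f}")

Under these assumed figures project B has the higher NPV and would be preferred; notably, nothing in the calculation itself registers ethical considerations, which is precisely the point made by critics of the finance paradigm. 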
Other issues. Fairness in trading practices, trading conditions, financial contracting, sales practices, consultancy services, tax payments, internal audit, external audit, and executive compensation also falls under the umbrella of finance and accounting. Particular corporate ethical/legal abuses include: creative accounting, earnings management, misleading financial analysis, insider trading, securities fraud, bribery/kickbacks, and facilitation payments. Outside of corporations, bucket shops and forex scams are criminal manipulations of financial markets. Cases include the accounting scandals at Enron, WorldCom, and Satyam. Human resource management. Human resource management occupies the sphere of activity of recruitment and selection, orientation, performance appraisal, training and development, industrial relations, and health and safety issues. Business ethicists differ in their orientation towards labor ethics. Some assess human resource policies according to whether they support an egalitarian workplace and the dignity of labor. Issues including employment itself, privacy, compensation in accord with comparable worth, and collective bargaining (and/or its opposite) can be seen either as inalienable rights or as negotiable. Discrimination issues include discrimination by age (preferring the young or the old), gender and sexual harassment, race, religion, disability, weight, and attractiveness. A common approach to remedying discrimination is affirmative action. Once hired, employees have the right to occasional cost-of-living increases, as well as raises based on merit. Promotions, however, are not a right, and there are often fewer openings than qualified applicants. It may seem unfair if an employee who has been with a company longer is passed over for a promotion, but it is not unethical. It is only unethical if the employer did not give the employee proper consideration or used improper criteria for the promotion. Each employer should know the distinction between what is unethical and what is illegal. An illegal action breaks the law, whereas an action that is morally wrong is unethical. In the workplace, what is unethical is not necessarily illegal, and conduct should follow the guidelines put in place by OSHA (the Occupational Safety and Health Administration), the EEOC (Equal Employment Opportunity Commission), and other legally binding entities. Potential employees have ethical obligations to employers, involving intellectual property protection and whistle-blowing. Employers must consider workplace safety, which may involve modifying the workplace, or providing appropriate training or hazard disclosure. Requirements differ according to the location and type of work taking place, and employers may need to comply with standards that protect employees and non-employees under workplace safety rules. Larger economic issues such as immigration, trade policy, globalization, and trade unionism affect workplaces and have an ethical dimension, but are often beyond the purview of individual companies. Trade unions. Trade unions, for example, may push employers to establish due process for workers, but may also cause job loss by demanding unsustainable compensation and work rules. Unionized workplaces may confront union busting and strike breaking and face the ethical implications of work rules that advantage some workers over others. Management strategy. 
Among the many people-management strategies that companies employ are a "soft" approach that regards employees as a source of creative energy and participants in workplace decision-making, a "hard" version explicitly focused on control, and Theory Z, which emphasizes philosophy, culture, and consensus. None ensures ethical behavior. Some studies claim that sustainable success requires a humanely treated and satisfied workforce. Procurement. In the procurement field, ethical behaviour includes adherence to the principle of fairness, avoidance of preferential treatment, and the elimination of malpractice in relation to suppliers and their workforces. The Chartered Institute of Procurement & Supply notes that "it's important to recognise that being ethical isn't just within the business community but consumers can also make ethical choices in their buying decisions". Sales and marketing. Marketing ethics came of age only as late as the 1990s. Marketing ethics was approached from the ethical perspectives of virtue (virtue ethics), deontology, consequentialism, pragmatism, and relativism. Ethics in marketing deals with the principles, values, and/or ideas by which marketers (and marketing institutions) ought to act. Marketing ethics is also contested terrain, beyond the previously described issue of potential conflicts between profitability and other concerns. Ethical marketing issues include marketing redundant or dangerous products or services; transparency about environmental risks; transparency about product ingredients, such as genetically modified organisms, and possible health, financial, or security risks; respect for consumer privacy and autonomy; truthfulness in advertising; and fairness in pricing and distribution. According to Borgerson and Schroeder (2008), marketing can influence individuals' perceptions of and interactions with other people, implying an ethical responsibility to avoid distorting those perceptions and interactions. Marketing ethics involves pricing practices, including illegal actions such as price fixing and legal actions including price discrimination and price skimming. Certain promotional activities have drawn fire, including greenwashing, bait and switch, shilling, viral marketing, spam (electronic), pyramid schemes, and multi-level marketing. Advertising has raised objections about attack ads, subliminal messages, sex in advertising, and marketing in schools. Inter-organizational relationships. Scholars in business and management have paid much attention to the ethical issues in the different forms of relationships between organizations, such as buyer-supplier relationships, networks, alliances, and joint ventures. Drawing in particular on transaction cost theory and agency theory, they note the risk of opportunistic and unethical practices between partners through, for instance, shirking, poaching, and other deceitful behaviors. In turn, research on inter-organizational relationships has observed the role of formal and informal mechanisms in both preventing unethical practices and mitigating their consequences. It especially discusses the importance of formal contracts and relational norms between partners for managing ethical issues. Emerging issues. Stakeholders, being the most important element of a business, are chiefly concerned with determining whether the business is behaving ethically or unethically. A business's actions and decisions should be ethical in the first place, before they grow into an ethical or even legal issue. 
"In the case of the government, community, and society what was merely an ethical issue can become a legal debate and eventually law." Some emerging ethical issues are: Production. This area of business ethics usually deals with the duties of a company to ensure that products and production processes do not needlessly cause harm. Since few goods and services can be produced and consumed with zero risks, determining the ethical course can be difficult. In some case, consumers demand products that harm them, such as tobacco products. Production may have environmental impacts, including pollution, habitat destruction and urban sprawl. The downstream effects of technologies nuclear power, genetically modified food and mobile phones may not be well understood. While the precautionary principle may prohibit introducing new technology whose consequences are not fully understood, that principle would have prohibited the newest technology introduced since the Industrial Revolution. Product testing protocols have been attacked for violating the rights of both humans and animals. There are sources that provide information on companies that are environmentally responsible or do not test on animals. Property. The etymological root of property is the Latin , which refers to 'nature', 'quality', 'one's own', 'special characteristic', 'proper', 'intrinsic', 'inherent', 'regular', 'normal', 'genuine', 'thorough, complete, perfect' etc. The word property is value loaded and associated with the personal qualities of propriety and respectability, also implies questions relating to ownership. A 'proper' person owns and is true to herself or himself, and is thus genuine, perfect and pure. Modern history of property rights. Modern discourse on property emerged by the turn of the 17th century within theological discussions of that time. For instance, John Locke justified property rights saying that God had made "the earth, and all inferior creatures, [in] common to all men". In 1802 utilitarian Jeremy Bentham stated, "property and law are born together and die together". One argument for property ownership is that it enhances individual liberty by extending the line of non-interference by the state or others around the person. Seen from this perspective, property right is absolute and property has a special and distinctive character that precedes its legal protection. Blackstone conceptualized property as the "sole and despotic dominion which one man claims and exercises over the external things of the world, in total exclusion of the right of any other individual in the universe". Slaves as property. During the seventeenth and eighteenth centuries, slavery spread to European colonies including America, where colonial legislatures defined the legal status of slaves as a form of property. Combined with theological justification, the property was taken to be essentially natural ordained by God. Property, which later gained meaning as ownership and appeared natural to Locke, Jefferson and to many of the 18th and 19th century intellectuals as land, labor or idea, and property right over slaves had the same theological and essentialized justification It was even held that the property in slaves was a sacred right. Wiecek says, "Yet slavery was more clearly and explicitly established under the Constitution than it had been under the Articles". In an 1857 judgment, US Supreme Court Chief Justice Roger B. Taney said, "The right of property in a slave is distinctly and expressly affirmed in the Constitution." 
Natural right vs social construct. Neoliberals hold that private property rights are a non-negotiable natural right. Davies counters with "property is no different from other legal categories in that it is simply a consequence of the significance attached by law to the relationships between legal persons." Singer claims, "Property is a form of power, and the distribution of power is a political problem of the highest order". Rose finds, "'Property' is only an effect, a construction, of relationships between people, meaning that its objective character is contestable. Persons and things are 'constituted' or 'fabricated' by legal and other normative techniques." Singer observes, "A private property regime is not, after all, a Hobbesian state of nature; it requires a working legal system that can define, allocate, and enforce property rights." Davies claims that common law theory generally favors the view that "property is not essentially a 'right to a thing', but rather a separable bundle of rights subsisting between persons which may vary according to the context and the object which is at stake". In common parlance, property rights involve a bundle of rights including occupancy, use and enjoyment, and the right to sell, devise, give, or lease all or part of these rights. Custodians of property have obligations as well as rights. Michelman writes, "A property regime thus depends on a great deal of cooperation, trustworthiness, and self-restraint among the people who enjoy it." Menon claims that the autonomous individual, responsible for his or her own existence, is a cultural construct moulded by Western culture rather than the truth about the human condition. Penner views property as an "illusion"—a "normative phantasm" without substance. In the neoliberal literature, property is part of the private side of a public/private dichotomy and acts as a counterweight to state power. Davies counters that "any space may be subject to plural meanings or appropriations which do not necessarily come into conflict". Private property has never been a universal doctrine, although since the end of the Cold War it has become nearly so. Some societies, e.g., Native American bands, held land, if not all property, in common. When groups came into conflict, the victor often appropriated the loser's property. The rights paradigm tended to stabilize the distribution of property holdings on the presumption that title had been lawfully acquired. Property does not exist in isolation, and so property rights do not either. Bryan claimed that property rights describe relations among people and not just relations between people and things. Singer holds that the idea that owners have no legal obligations to others wrongly supposes that property rights hardly ever conflict with other legally protected interests. Singer continues, implying that legal realists "did not take the character and structure of social relations as an important independent factor in choosing the rules that govern market life". The ethics of property rights begins with recognizing the vacuous nature of the notion of property. Intellectual property. Intellectual property (IP) encompasses expressions of ideas, thoughts, codes, and information. "Intellectual property rights" (IPR) treat IP as a kind of real property, subject to analogous protections, rather than as a reproducible good or service. Boldrin and Levine argue that "government does not ordinarily enforce monopolies for producers of other goods. 
This is because it is widely recognized that monopoly creates many social costs. Intellectual monopoly is no different in this respect. The question we address is whether it also creates social benefits commensurate with these social costs." International standards relating to intellectual property rights are enforced through the Agreement on Trade-Related Aspects of Intellectual Property Rights. In the US, IP other than copyrights is regulated by the United States Patent and Trademark Office. The US Constitution included the power to protect intellectual property, empowering the Federal government "to promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries". Boldrin and Levine see no value in such state-enforced monopolies, stating, "we ordinarily think of innovative monopoly as an oxymoron." Further, they comment that 'intellectual property' "is not like ordinary property at all, but constitutes a government grant of a costly and dangerous private monopoly over ideas. We show through theory and example that intellectual monopoly is not necessary for innovation and as a practical matter is damaging to growth, prosperity, and liberty". Steelman defends patent monopolies, writing, "Consider prescription drugs, for instance. Such drugs have benefited millions of people, improving or extending their lives. Patent protection enables drug companies to recoup their development costs because for a specific period of time they have the sole right to manufacture and distribute the products they have invented." The court cases brought by 39 pharmaceutical companies against South Africa's 1997 Medicines and Related Substances Control Amendment Act, which was intended to provide affordable HIV medicines, have been cited as a harmful effect of patents. One attack on IPR is moral rather than utilitarian, claiming that inventions are mostly a collective, cumulative, path-dependent, social creation and that therefore no one person or firm should be able to monopolize them even for a limited period. The opposing argument is that the benefits of innovation arrive sooner when patents encourage innovators and their investors to increase their commitments. Roderick T. Long, a libertarian philosopher, argued: Machlup concluded that patents do not have the intended effect of enhancing innovation. The self-declared anarchist Proudhon, in his seminal 1847 work, noted, "Monopoly is the natural opposite of competition," and continued, "Competition is the vital force which animates the collective being: to destroy it, if such a supposition were possible, would be to kill society." Mindeli and Pipiya argued that the knowledge economy is an economy of abundance because it relies on the "infinite potential" of knowledge and ideas rather than on limited natural resources, labor, and capital. Allison envisioned an egalitarian distribution of knowledge. Kinsella claimed that IPR create artificial scarcity and reduce equality. Bouckaert wrote, "Natural scarcity is that which follows from the relationship between man and nature. Scarcity is natural when it is possible to conceive of it before any human, institutional, contractual arrangement. Artificial scarcity, on the other hand, is the outcome of such arrangements. Artificial scarcity can hardly serve as a justification for the legal framework that causes that scarcity. Such an argument would be completely circular. 
On the contrary, artificial scarcity itself needs a justification." Corporations fund much IP creation and can acquire IP they do not create, to which Menon and others have objected. Andersen claims that IPR have increasingly become an instrument in eroding the public domain. Ethical and legal issues include patent infringement, copyright infringement, trademark infringement, patent and copyright misuse, submarine patents, biological patents, patent, copyright, and trademark trolling, employee raiding and monopolizing talent, bioprospecting, biopiracy, industrial espionage, and digital rights management. Notable IP copyright cases include "A&M Records, Inc. v. Napster, Inc.", "Eldred v. Ashcroft", and Disney's lawsuit against the Air Pirates. International issues. While business ethics emerged as a field in the 1970s, international business ethics did not emerge until the late 1990s, looking back on the international developments of that decade. Many new practical issues arose out of the international context of business. Theoretical issues such as the cultural relativity of ethical values receive more emphasis in this field. Other, older issues can be grouped here as well. Issues and subfields include: Foreign countries often use dumping as a competitive threat, selling products at prices lower than their normal value. This can lead to problems in domestic markets, which find it difficult to compete with the pricing set by foreign markets. In 2009, the International Trade Commission was researching anti-dumping laws. Dumping is often seen as an ethical issue, as larger companies take advantage of other, less economically advanced companies. Issues. Ethical issues often arise in business settings, whether through business transactions or the forming of new business relationships. Ethics is also a major focus in the auditing field, where the type of verification performed can be directly dictated by ethical theory. An ethical issue in a business atmosphere may refer to any situation that requires business associates, as individuals or as a group (for example, a department or firm), to evaluate the morality of specific actions and subsequently make a decision among the choices. Some ethical issues of particular concern in today's evolving business market include honesty, integrity, professional behavior, environmental issues, harassment, and fraud, to name a few. A 2009 National Business Ethics survey found that types of employee-observed ethical misconduct included abusive behavior (at a rate of 22 percent), discrimination (at a rate of 14 percent), improper hiring practices (at a rate of 10 percent), and company resource abuse (at a rate of percent). The ethical issues associated with honesty are widespread and vary greatly in business, from the misuse of company time or resources to lying with malicious intent, engaging in bribery, or creating conflicts of interest within an organization. Honesty wholly encompasses the truthful speech and actions of an individual. Some cultures and belief systems even consider honesty to be an essential pillar of life, such as Confucianism and Buddhism (where it is referred to as sacca, part of the Four Noble Truths). Many employees lie in order to reach goals or avoid assignments or negative issues; however, sacrificing honesty in order to gain status or reap rewards poses potential problems for the organization's overall ethical culture and jeopardizes organizational goals in the long run. 
Using company time or resources for personal use is also commonly viewed as unethical because it amounts to stealing from the company. The misuse of resources costs companies billions of dollars each year, averaging about 4.25 hours per week of stolen time alone, and employees' abuse of Internet services is another major concern. Bribery, on the other hand, is not only considered unethical in business practices, but is also illegal. In accordance with this, the Foreign Corrupt Practices Act was established in 1977 to deter international businesses from giving or receiving unwarranted payments and gifts intended to influence the decisions of executives and political officials. However, small payments known as facilitation payments are not considered unlawful under the Foreign Corrupt Practices Act if they are used for routine public governance activities, such as permits or licenses. Influential factors on business ethics. Many aspects of the work environment influence an individual's decision-making regarding ethics in the business world. When an individual is on the path of growing a company, many outside influences can pressure them to perform in a certain way. The core of a person's performance in the workplace is rooted in their personal code of behavior. A person's personal code of ethics encompasses many different qualities, such as integrity, honesty, communication, respect, compassion, and common goals. In addition, the ethical standards set forth by a person's superiors often translate into their own code of ethics. The company's policy is the 'umbrella' of ethics that plays a major role in the personal development and decision-making processes that people undertake with respect to ethical behavior. The ethics of a company and its individuals are heavily influenced by the state of their country. If a country is heavily plagued by poverty, large corporations continue to grow, but smaller companies begin to wither and are then forced to adapt and scavenge for any method of survival. As a result, the leadership of a company is often tempted to participate in unethical methods to obtain new business opportunities. Additionally, social media is arguably the most influential factor in ethics. Immediate access to so much information and to the opinions of millions highly influences people's behavior. The desire to conform with what is portrayed as the norm often manipulates our idea of what is morally and ethically sound. Popular trends on social media and the instant gratification received from participating in them quickly distort people's ideas and decisions. Economic systems. Political economy and political philosophy have ethical implications, particularly regarding the distribution of economic benefits. John Rawls and Robert Nozick are both notable contributors. For example, Rawls has been interpreted as offering a critique of offshore outsourcing on social contract grounds. Law and regulation. Laws are the written statutes, codes, and opinions of government organizations by which citizens, businesses, and persons present within a jurisdiction are expected to govern themselves or face legal sanction. Sanctions for violating the law can include (a) civil penalties, such as fines, pecuniary damages, and loss of licenses, property, rights, or privileges; (b) criminal penalties, such as fines, probation, imprisonment, or a combination thereof; or (c) both civil and criminal penalties. 
Very often it is held that business is not bound by any ethics other than abiding by the law. Milton Friedman was the pioneer of this view. He held that corporations have the obligation to make a profit within the framework of the legal system, nothing more. Friedman made it explicit that the duty of business leaders is "to make as much money as possible while conforming to the basic rules of the society, both those embodied in the law and those embodied in ethical custom". Ethics for Friedman is nothing more than abiding by customs and laws. The reduction of ethics to abidance by laws and customs, however, has drawn serious criticism. Counter to Friedman's logic, it is observed that legal procedures are technocratic, bureaucratic, rigid, and obligatory, whereas an ethical act is a conscientious, voluntary choice that goes beyond normativity. Law is retroactive: crime precedes law, for in order for a law against a crime to be passed, the crime must already have happened. Laws are blind to crimes they do not define. Further, as per the law, "conduct is not criminal unless forbidden by law which gives advance warning that such conduct is criminal". Also, the law presumes the accused is innocent until proven guilty and that the state must establish the guilt of the accused beyond reasonable doubt. Under the liberal laws followed in most democracies, until the government prosecutor, with the limited resources available to her, proves the firm guilty, the accused is considered to be innocent. Though the liberal premise of law is necessary to protect individuals from being persecuted by government, it is not a sufficient mechanism to make firms morally accountable. Implementation. Corporate policies. As part of more comprehensive compliance and ethics programs, many companies have formulated internal policies pertaining to the ethical conduct of employees. These policies can be simple exhortations in broad, highly generalized language (typically called a corporate ethics statement), or they can be more detailed policies, containing specific behavioral requirements (typically called corporate ethics codes). They are generally meant to identify the company's expectations of workers and to offer guidance on handling some of the more common ethical problems that might arise in the course of doing business. It is hoped that having such a policy will lead to greater ethical awareness, consistency in application, and the avoidance of ethical disasters. An increasing number of companies also require employees to attend seminars regarding business conduct, which often include discussion of the company's policies, specific case studies, and legal requirements. Some companies even require their employees to sign agreements stating that they will abide by the company's rules of conduct. Many companies are assessing the environmental factors that can lead employees to engage in unethical conduct. A competitive business environment may call for unethical behavior. Lying has become expected in fields such as trading. An example of this is the set of issues surrounding the unethical actions of Salomon Brothers. Not everyone supports corporate policies that govern ethical conduct. Some claim that ethical problems are better dealt with by depending upon employees to use their own judgment. Others believe that corporate ethics policies are primarily rooted in utilitarian concerns and that they exist mainly to limit the company's legal liability or to curry public favor by giving the appearance of being a good corporate citizen. 
Ideally, the company will avoid a lawsuit because its employees will follow the rules. Should a lawsuit occur, the company can claim that the problem would not have arisen if the employee had only followed the code properly. Some corporations have tried to burnish their ethical image by creating whistle-blower protections, such as anonymity. In the case of Citi, this is called the Ethics Hotline, though it is unclear whether firms such as Citi take offences reported to these hotlines seriously. Sometimes there is a disconnect between a company's code of ethics and its actual practices. Thus, whether or not such conduct is explicitly sanctioned by management, at worst this makes the policy duplicitous, and at best it is merely a marketing tool. Jones and Parker wrote, "Most of what we read under the name business ethics is either sentimental common sense or a set of excuses for being unpleasant." Many manuals are procedural form-filling exercises unconcerned with the real ethical dilemmas. For instance, the US Department of Commerce ethics program treats business ethics as a set of instructions and procedures to be followed by 'ethics officers'; some others claim being ethical is just for the sake of being ethical. Business ethicists may trivialize the subject, offering standard answers that do not reflect a situation's complexity. Richard DeGeorge wrote in regard to the importance of maintaining a corporate code: Ethics codes often state a company's view on DEI (diversity, equity, inclusivity). In 2025, some companies began changing their ethics policies to reduce the focus on DEI, and some have decided to eliminate DEI from their policies completely. Ethics officers. Following a series of fraud, corruption, and abuse scandals that affected the United States defense industry in the mid-1980s, the Defense Industry Initiative (DII) was created to promote ethical business practices and ethics management in multiple industries. Subsequent to these scandals, many organizations began appointing ethics officers (also referred to as "compliance" officers). In 1991, the Ethics & Compliance Officer Association—originally the Ethics Officer Association (EOA)—was founded at the Center for Business Ethics at Bentley University as a professional association for ethics and compliance officers. The passing of the Federal Sentencing Guidelines for Organizations in 1991 was another factor in many companies appointing ethics/compliance officers. These guidelines, intended to assist judges with sentencing, set standards organizations must follow to obtain a reduction in sentence if they should be convicted of a federal offense. Following the high-profile corporate scandals of companies like Enron, WorldCom, and Tyco between 2001 and 2004, and following the passage of the Sarbanes–Oxley Act, many small and mid-sized companies also began to appoint ethics officers. Often reporting to the chief executive officer, ethics officers focus on uncovering or preventing unethical and illegal actions. This is accomplished by assessing the ethical implications of the company's activities, making recommendations on ethical policies, and disseminating information to employees. The effectiveness of ethics officers is not clear. The establishment of an ethics officer position is likely to be insufficient in driving ethical business practices without a corporate culture that values ethical behavior. 
These values and behaviors should be consistently and systemically supported by those at the top of the organization. Employees with strong community involvement, loyalty to employers, superiors, or owners, smart work practices, and trust among team members help inculcate such a corporate culture. Sustainability initiatives. Many corporate and business strategies now include sustainability. In addition to traditional environmental 'green' sustainability concerns, business ethics practices have expanded to include social sustainability. Social sustainability focuses on issues related to human capital in the business supply chain, such as workers' rights, working conditions, child labor, and human trafficking. Incorporation of these considerations is increasing, as consumers and procurement officials demand documentation of a business's compliance with national and international initiatives, guidelines, and standards. Many industries have organizations dedicated to verifying ethical delivery of products from start to finish, such as the Kimberley Process, which aims to stop the flow of conflict diamonds into international markets, or the Fair Wear Foundation, dedicated to sustainability and fairness in the garment industry. Initiatives in sustainability encompass "green" topics as well as social sustainability. Tao "et al." refer to a variety of "green" business practices, including green strategy, green design, green production, and green operation. There are, however, many different ways in which sustainability initiatives can be implemented by a company. Improving operations. An organization can implement sustainability initiatives by improving its operations and manufacturing process so as to align it more closely with environmental, social, and governance concerns. Johnson & Johnson incorporates policies from the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights, and the International Covenant on Economic, Social and Cultural Rights, applying these principles not only to members of its supply chain but also to internal operations. Walmart has made commitments to doubling its truck fleet efficiency by 2015 by replacing two-thirds of its fleet with more fuel-efficient trucks, including hybrids. Dell has integrated alternative, recycled, and recyclable materials in its products and packaging design, improving energy efficiency and design for end-of-life and recyclability. Dell plans to reduce the energy intensity of its product portfolio by 80% by 2020. Board leadership. The board of a company can decide to lower executive compensation by a given percentage, and give that percentage of compensation to a specific cause. This is an effort which can only be implemented from the top, as it will affect the compensation of all executives in the company. At Alcoa, an aluminum company based in the US, "1/5th of executive cash compensation is tied to safety, diversity, and environmental stewardship, which includes greenhouse gas emission reductions and energy efficiency" (Best Practices). This is not the case for most companies. A board taking a uniform step towards environmental, social, and governance issues is usually seen only at companies directly linked to the utilities, energy, or materials industries, a category which Alcoa, as an aluminum company, falls within. 
Instead, formal oversight of environmental, social, and governance issues is more usually seen in governance committees and audit committees, rather than in the full board of directors. According to research analysis done by Pearl Meyer in support of the NACD 2017 Director Compensation Report, "among 1,400 public companies reviewed, only slightly more than five percent of boards have a designated committee to address ESG issues" (How compensation can). Management accountability. Similar to board leadership, companies can create steering committees and other types of committees specialized for sustainability, with senior executives identified and held accountable for meeting and constantly improving sustainability goals. Executive compensation. Companies can introduce bonus schemes that reward executives for meeting non-financial performance goals, including safety targets, greenhouse gas emission reduction targets, and goals for engaging stakeholders to help shape the company's public policy positions. Companies such as Exelon have implemented policies like this. Stakeholder engagement. Other companies keep sustainability within their strategy and goals, presenting findings at shareholder meetings and actively tracking metrics on sustainability. Companies such as PepsiCo, Heineken, and FIFCO take steps in this direction to implement sustainability initiatives (Best Practices). Companies such as Coca-Cola have actively tried to improve the efficiency of their water usage, hiring third-party auditors to evaluate their water management approach. FIFCO has also successfully led water-management initiatives. Employee engagement. Implementation of sustainability projects by directly appealing to employees (typically through the human resources department) is another option for companies. This involves integrating sustainability into the company culture, with hiring practices and employee training. General Electric is a company that is taking the lead in implementing initiatives in this manner. Bank of America directly engaged employees by implementing LEED (Leadership in Energy and Environmental Design) certified buildings, with a fifth of its buildings meeting these certifications. Supply chain management. Companies can establish requirements not only for internal operations but also for first-tier and second-tier suppliers, to help drive environmental and social expectations further down the supply chain. Companies such as Starbucks, FIFCO, and Ford Motor Company have implemented requirements that suppliers must meet to win their business. Starbucks has led efforts in engaging suppliers and local communities where they operate to accelerate investment in sustainable farming. Starbucks set a goal of ethically sourcing 100% of its coffee beans by 2015. Transparency. By revealing decision-making data about how sustainability goals were reached, companies can give away insights that can help others across the industry and beyond make more sustainable decisions. Nike launched its "Making" app in 2013, which released data about the sustainability of the materials it was using. This ultimately allows other companies to make more sustainable design decisions and create lower-impact products. Academic discipline. As an academic discipline, business ethics emerged in the 1970s. Since no academic business ethics journals or conferences existed, researchers published in general management journals and attended general conferences. 
Over time, specialized peer-reviewed journals appeared, and more researchers entered the field. Corporate scandals in the early 2000s increased the field's popularity. As of 2009, sixteen academic journals devoted to various business ethics issues existed, with "Journal of Business Ethics" and "Business Ethics Quarterly" considered the leaders. "Journal of Business Ethics Education" publishes articles specifically about education in business ethics. The International Business Development Institute is a global non-profit organization that represents 217 nations and all 50 U.S. states. It offers a Charter in Business Development that focuses on ethical business practices and standards. The Charter is directed by Harvard University, MIT, and Fulbright Scholars, and it includes graduate-level coursework in economics, politics, marketing, management, technology, and legal aspects of business development as it pertains to business ethics. IBDI also oversees the International Business Development Institute of Asia, which provides individuals living in 20 Asian nations the opportunity to earn the Charter. Religious views. Sharia law, followed by many Muslims, specifically prohibits charging interest on loans in banking. Traditional Confucian thought discourages profit-seeking. Christianity offers the Golden Rule command, "Therefore all things whatsoever ye would that men should do to you, do ye even so to them: for this is the law and the prophets." According to the article "Theory of the real economy", there is a narrower point of view within the Christian faith on the relationship between ethics and religious traditions. The article stresses how Christianity is capable of establishing reliable boundaries for financial institutions. One criticism comes from Pope Benedict, who described the "damaging effects of the real economy of badly managed and largely speculative financial dealing." It is argued that Christianity has the potential to transform the nature of finance and investment, but only if theologians and ethicists provide more evidence of what is real in economic life. Business ethics receives an extensive treatment in Jewish thought and Rabbinic literature, both from an ethical ("Mussar") and a legal ("Halakha") perspective; see the article "Jewish business ethics" for further discussion. According to the article "Indian Philosophy and Business Ethics: A Review" by Chandrani Chattopadyay, Hindus follow "Dharma" as business ethics, and unethical business practices are termed "Adharma". Businessmen are supposed to maintain steady-mindedness, self-purification, non-violence, concentration, clarity, and control over the senses. Books like the "Bhagavad Gita" and the "Arthashastra" contribute much towards the conduct of ethical business. Related disciplines. Business ethics is related to the philosophy of economics, the branch of philosophy that deals with the philosophical, political, and ethical underpinnings of business and economics. Business ethics operates on the premise, for example, that the ethical operation of a private business is possible; those who dispute that premise, such as libertarian socialists (who contend that "business ethics" is an oxymoron), do so by definition outside of the domain of business ethics proper. The philosophy of economics also deals with questions such as what, if any, are the social responsibilities of a business; business management theory; theories of individualism vs.
collectivism; free will among participants in the marketplace; the role of self-interest; invisible hand theories; the requirements of social justice; and natural rights, especially property rights, in relation to the business enterprise. Business ethics is also related to political economy, which is economic analysis from political and historical perspectives. Political economy deals with the distributive consequences of economic actions.
4775
48863149
https://en.wikipedia.org/wiki?curid=4775
British Standards
British Standards (BS) are the standards produced by the BSI Group, which is incorporated under a royal charter and formally designated as the national standards body (NSB) for the UK. The BSI Group produces British Standards under the authority of the charter, which lays down as one of the BSI's objectives to: Formally, as stated in a 2002 memorandum of understanding between the BSI and the United Kingdom Government, British Standards are defined as: Products and services which BSI certifies as having met the requirements of specific standards within designated schemes are awarded the Kitemark. History. BSI Group began in 1901 as the "Engineering Standards Committee", led by James Mansergh, to standardize the number and type of steel sections, in order to make British manufacturers more efficient and competitive. Over time the standards developed to cover many aspects of tangible engineering, and then engineering methodologies, including quality systems, safety, and security. Creation. The BSI Group as a whole does not produce British Standards, as standards work within the BSI is decentralized. The governing board of BSI establishes a Standards Board. The Standards Board does little apart from setting up sector boards (a sector in BSI parlance being a field of standardization such as ICT, quality, agriculture, manufacturing, or fire). Each sector board, in turn, constitutes several technical committees. It is the technical committees that formally approve a British Standard, which is then presented to the secretary of the supervisory sector board for endorsement of the fact that the technical committee has indeed completed the task for which it was constituted. Standards. The standards produced are titled British Standard XXXX[-P]:YYYY, where XXXX is the number of the standard, P is the number of the part of the standard (where the standard is split into multiple parts), and YYYY is the year in which the standard came into effect. BSI Group currently has over 27,000 active standards. Products are commonly specified as meeting a particular British Standard, and in general, this can be done without any certification or independent testing. The standard simply provides a shorthand way of claiming that certain specifications are met, while encouraging manufacturers to adhere to a common method for such a specification. The Kitemark can be used to indicate certification by BSI, but only where a Kitemark scheme has been set up around a particular standard. It is mainly applicable to safety and quality management standards. There is a common misunderstanding that Kitemarks are necessary to prove compliance with any BS standard, but in general, it is neither desirable nor possible that every standard be 'policed' in this way. Following the move towards harmonization of standards in Europe, some British Standards are gradually being superseded or replaced by the relevant European Standards (EN). Status of standards. Standards are continuously reviewed and developed and are periodically allocated one or more of the following status keywords. PAS documents. BSI also publishes a series of Publicly Available Specification (PAS) documents. PAS documents are a flexible and rapid standards development model open to all organizations. A PAS is a sponsored piece of work allowing organizations flexibility in the rapid creation of a standard while also allowing for a greater degree of control over the document's development. A typical development time frame for a PAS is around six to nine months.
Once published by BSI, a PAS has all the functionality of a British Standard for the purposes of creating schemes such as management systems and product benchmarks as well as codes of practice. A PAS is a living document: after two years the document will be reviewed and a decision made with the client as to whether or not it should be taken forward to become a formal standard. The term PAS was originally an abbreviation for "product approval specification", a name which was subsequently changed to "publicly available specification". However, according to BSI, not all PAS documents are structured as specifications, and the term is now sufficiently well established not to require any further amplification. Availability. Copies of British Standards are sold at the BSI Online Shop or can be accessed via subscription to British Standards Online (BSOL). They can also be ordered via the publishing units of many other national standards bodies (ANSI, DIN, etc.) and from several specialized suppliers of technical specifications. British Standards, including European and international adoptions, are available in many university and public libraries that subscribe to the BSOL platform. Librarians and lecturers at UK-based subscribing universities have full access rights to the collection, while students can copy/paste and print but not download a standard. Up to 10% of the content of a standard can be copy/pasted for personal or internal use, and up to 5% of the collection can be made available as a paper or electronic reference collection at the subscribing university. Because of their reference material status, standards are not available for interlibrary loan. Public library users in the UK may have access to BSOL on a view-only basis if their library service subscribes to the BSOL platform. Users may also be able to access the collection remotely if they have a valid library card and the library offers secure access to its resources. The BSI Knowledge Centre in Chiswick, London can be contacted directly about viewing standards in their Members' Reading Room.
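The numbering convention described under Standards above (British Standard XXXX[-P]:YYYY) is regular enough to be parsed mechanically. What follows is a minimal illustrative sketch in Python; it is not part of any BSI tooling, and the function name and regular expression are the author's own assumptions for illustration only.

import re

# Illustrative only: split a reference of the form "BS XXXX[-P]:YYYY"
# into its standard number, optional part number, and year.
BS_REF = re.compile(r"^BS\s+(?P<number>\d+)(?:-(?P<part>\d+))?:(?P<year>\d{4})$")

def parse_bs_reference(ref: str) -> dict:
    """Return the number, part (None if absent), and year of a BS reference."""
    match = BS_REF.match(ref.strip())
    if match is None:
        raise ValueError(f"Not a recognised BS reference: {ref!r}")
    return {
        "number": int(match.group("number")),
        "part": int(match.group("part")) if match.group("part") else None,
        "year": int(match.group("year")),
    }

# Example: BS 5950-1:2000 -> number 5950, part 1, year 2000.
print(parse_bs_reference("BS 5950-1:2000"))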
4776
19372301
https://en.wikipedia.org/wiki?curid=4776
Building society
A building society is a financial institution owned by its members as a mutual organization, which offers banking and related financial services, especially savings and mortgage lending. They exist in the United Kingdom, Australia and New Zealand, and formerly in Ireland and several Commonwealth countries, including South Africa as mutual banks. They are similar to credit unions, but rather than promoting thrift and offering unsecured and business loans, the purpose of a building society is to provide home mortgages to members. Borrowers and depositors are society members, setting policy and appointing directors on a one-member, one-vote basis. Building societies often provide other retail banking services, such as current accounts, credit cards and personal loans. The term "building society" first arose in the 19th century in Great Britain from cooperative savings groups. In the United Kingdom, building societies compete with banks for most consumer banking services, especially mortgage lending and savings accounts, and regulations permit up to half of their lending to be funded by debt to non-members, allowing societies to access wholesale bond and money markets to fund mortgages. The world's largest building society is Britain's Nationwide Building Society. In Australia, building societies also compete with retail banks and offer the full range of banking services to consumers. History in the United Kingdom. Building societies as an institution began in late-18th century Birmingham – a town which was undergoing rapid economic and physical expansion driven by a multiplicity of small metalworking firms, whose many highly skilled and prosperous owners readily invested in property. Many of the early building societies were based in taverns or coffeehouses, which had become the focus for a network of clubs and societies for co-operation and the exchange of ideas among Birmingham's highly active citizenry as part of the movement known as the Midlands Enlightenment. The first building society to be established was Ketley's Building Society, founded by Richard Ketley, the landlord of the "Golden Cross" inn, in 1775. Members of Ketley's society paid a monthly subscription to a central pool of funds which was used to finance the building of houses for members, which in turn acted as collateral to attract further funding to the society, enabling further construction. By 1781 three more societies had been established in Birmingham, with a fourth in the nearby town of Dudley; and 19 more formed in Birmingham between 1782 and 1795. The first outside the English Midlands was established in Leeds in 1785. Most of the original societies were fully "terminating", where they would be dissolved when all members had a house: the last of them, First Salisbury and District Perfect Thrift Building Society, was wound up in March 1980. In the 1830s and 1840s a new development took place with the "permanent building society", where the society continued on a rolling basis, continually taking in new members as earlier ones completed purchases, such as Leek Building Society. The main legislative framework for the building society was the Building Societies Act 1874 (37 & 38 Vict. c. 42), with subsequent amending legislation in 1894, 1939 (see Coney Hall), and 1960. In their heyday, there were hundreds of building societies: just about every town in the country had a building society named after that town. 
Over succeeding decades the number of societies has decreased, as various societies merged to form larger ones, often renaming in the process, and other societies opted for demutualisation followed by – in the great majority of cases – eventual takeover by a listed bank. Most of the existing larger building societies are the result of the mergers of many smaller societies. All building societies in the UK are members of the Building Societies Association. At the start of 2008, there were 59 building societies in the UK, with total assets exceeding £360 billion. The number of societies in the UK fell by four during 2008 due to a series of mergers brought about, to a large extent, by the consequences of the 2008 financial crisis. There were three further mergers in each of 2009 and 2010, a demutualisation and a merger in 2011, and four further mergers between 2013 and 2018, which left only one building society headquartered in each of Scotland and Northern Ireland. Since then, the only merger has been in 2023, when the Manchester society merged with the Newcastle society. Demutualisation. In the 1980s, changes to British banking laws allowed building societies to offer banking services equivalent to normal banks. The management of a number of societies still felt that they were unable to compete with the banks, and a new Building Societies Act was passed in 1986 in response to their concerns. This permitted societies to 'demutualise'. If more than 75% of members voted in favour, the building society would then become a limited company like any other. Members' mutual rights were exchanged for shares in this new company. A number of the larger societies made such proposals to their members and all were accepted. Some listed on the London Stock Exchange, while others were acquired by larger financial groups. The process began with the demutualisation of the Abbey National Building Society in 1989. Then, from 1995 to late 1999, eight societies demutualised, accounting for two-thirds of building society assets as at 1994. Five of these societies became joint stock banks (plc), one merged with another and the other four were taken over by plcs (in two cases after the mutual had previously converted to a plc). As has been observed, demutualisation moves succeeded immediately because neither Conservative nor Labour party UK governments created a framework which put obstacles in the way of demutualisation. Political acquiescence in demutualisation was clearest in the case of the position on 'carpetbaggers', that is, those who joined societies by lodging minimum amounts of £100 or so in the hope of profiting from a distribution of surplus after demutualisation. The deregulating Building Societies Act 1986 contained an anti-carpetbagger provision in the form of a two-year rule. This prescribed a qualifying period of two years before savers could participate in a residual claim. But, before the 1989 Abbey National Building Society demutualisation, the courts found against the two-year rule after legal action brought by Abbey National itself to circumvent the intent of the legislators. After this the legislation did prevent a cash distribution to members of less than two years' standing, but the same result was obtained by permitting the issue of 'free' shares in the acquiring plc, saleable for cash. The Thatcher Conservative government declined to introduce amending legislation to make good the defect in the 'two-year rule'. 1980s and 1990s.
Building societies, like mutual life insurers, arose as people clubbed together to address a common need; in the case of the building societies, this was housing, and members were originally both savers and borrowers. But it very quickly became clear that 'outsider' savers were needed whose motive was profit through interest on deposits. Thus permanent building societies quickly became mortgage banks, and in such institutions there always existed a conflict of interest between borrowers and savers. It was the task of the movement to reconcile that conflict of interest so as to enable savers to conclude that their interests and those of borrowers were to some extent complementary rather than conflicting. The conflict of interest between savers and borrowers was never fully reconciled in the building societies, and upon deregulation that reconciliation became something of a lost cause. The management of building societies apparently could expend considerable time and resources (which belonged to the organisation) planning the effective capture of as much of the assets as they could. If so, this is arguably insider dealing on a grand scale, with the benefit of inside specialist knowledge of the business and resources of the firm not shared with outsiders like politicians and members (and, perhaps, regulators). Once the opportunity to claim was presented by management, the savers in particular could be relied upon to seize it. There were sufficient hard-up borrowers to take the inducement offered them by management (in spite of the fact that a few simple sums sufficed to demonstrate that they were probably going to end up effectively paying back the inducement). Management promoting demutualisation also thereby met managerial objectives, because the end of mutuality brought joint stock company (plc) style remuneration committee pay standards and share options. Share options for the management of converting societies appear to have been a powerful factor in management calculations. Instead of deploying their margin advantage as a defence of mutuality, around 1980 building societies began setting mortgage rates with reference to market clearing levels. In sum, they began behaving more like banks, seeking to maximise profit instead of the advantages of a mutual organisation. Some of these managements ended up in dispute with their own members, as in the first major conversion, that of the Abbey in 1989. In the end, after a number of large demutualisations, and pressure from carpetbaggers moving from one building society to another to cream off the windfalls, most of the societies whose management wished to keep them mutual modified their rules of membership in the late 1990s. The method usually adopted was membership rules ensuring that anyone newly joining a society would, for the first few years, be unable to get any profit out of a demutualisation. With the chance of a quick profit removed, the wave of demutualisations came to an end in 2000. One academic study found that demutualised societies' pricing behaviour on deposits and mortgages was more favourable to shareholders than to customers, with the remaining mutual building societies offering consistently better rates. 2000s and 2010s.
The Building Societies (Funding) and Mutual Societies (Transfers) Act 2007, known as the Butterfill Act, was passed in 2007, giving building societies greater powers to merge with other companies. These powers were used by Britannia in 2009 and Kent Reliance in 2011, leading to their demutualisation. Prior to 31 December 2010, deposits with building societies of up to £50,000 per individual, per institution, were normally protected by the Financial Services Compensation Scheme (FSCS), but Nationwide and Yorkshire building societies negotiated a temporary change to the terms of the FSCS to protect members of the societies they acquired in late 2008/early 2009. The amended terms allowed former members of multiple societies which merged into one to maintain multiple entitlements to FSCS protection until 30 September 2009 (later extended to 30 December 2010), so (for example) a member with £50,000 in each of Nationwide, Cheshire and Derbyshire at the time of the respective mergers would retain £150,000 of FSCS protection for their funds in the merged Nationwide. On 31 December 2010 the general FSCS limit for retail deposits was increased to £85,000 for banks and building societies, and the transitional arrangements in respect of building society mergers came to an end. List of building societies. United Kingdom. Current. There are 42 independent building societies, all of which are members of the Building Societies Association. Demutualised. Ten building societies of the United Kingdom demutualised between 1989 and 2000, either becoming a bank or being acquired by a larger bank. By 2008, every building society that floated on the stock market in the wave of demutualisations of the 1980s and 1990s had either been sold to a conventional bank or been nationalised. No longer exist. The following is an incomplete list of building societies in the United Kingdom that no longer exist independently, since they either merged with or were taken over by other organisations. They may still have an active presence on the high street (or online) as a trading name or as a distinct brand. This is typically because brands often build up specific reputations and attract certain clientele, and this can continue to be marketed successfully. Australia. In Australia, building societies evolved along British lines. Following the end of World War II, the terminating model was revived to fund returning servicemen's need for new houses. Hundreds were created with government seed capital, whereby the capital was returned to the government and the terminating societies retained the interest accumulated. Once all the seed funds were loaned, each terminating society could reapply for more seed capital, to the point where they could re-lend their own funds and thus became permanent societies. Terminating loans were still available and were used by staff inside the permanent businesses up until the 1980s, because their existence was not widely known after the early 1960s. Because of strict regulations on banks, building societies flourished until the deregulation of the Australian financial industry in the 1980s. Eventually many of the smaller building societies disappeared, while some of the largest (such as Advance and St George) attained the status of banks. More recent conversions include Heritage Bank, which converted from building society to bank in 2011; Hume in 2014; Wide Bay Building Society, which became Auswide Bank in 2015, with IMB following suit the same year; and Greater Building Society, which became Greater Bank in 2016.
Building societies converting to banks are no longer required to demutualise. A particular difference between Australian building societies and those elsewhere is that Australian building societies are required to incorporate as limited companies. A number of building societies remain in operation. Eswatini. The Building Societies Act of 1962 allowed for the registration of building societies in Eswatini. For a long time the country had only one building society. A second was registered in late 2019. Ireland. The Republic of Ireland had around 40 building societies at their mid-20th-century peak. Many of these were very small and, as the Irish commercial banks began to originate residential mortgages, the small building societies ceased to be competitive. Most merged or dissolved or, in the case of First Active plc, converted into conventional banks. The last remaining building societies, EBS Building Society and Irish Nationwide Building Society, demutualised and were transferred to or acquired by banks in 2011 following the effects of the Irish financial crisis. Leeds Building Society Ireland and Nationwide UK (Ireland) were Irish branches of building societies based in the United Kingdom; both have since ceased all Irish operations. Jamaica. In Jamaica, three building societies compete with commercial banks and credit unions for most consumer financial services. New Zealand. Regulation. In New Zealand, building societies are registered with the Registrar of Building Societies under the Building Societies Act 1965. Registration as a building society is merely a process of establishing the entity as a corporation. It is largely a formality, and easily achieved, as the capital requirement is minimal (20 members must be issued shares of not less than NZ$1,000 each, with a total minimum foundation share capital of NZ$200,000). As regards prudential supervision, a divide exists between building societies that operate in New Zealand, on the one hand, and those that (although formally registered in New Zealand) operate offshore. Building societies' registration details and filed documents are available in the Register of Building Societies held at the New Zealand Companies Office. Individual building societies. Over the years, a number of building societies were established. Some, including Countrywide Building Society and United Building Society, became banks in the 1980s and 1990s. Heartland Building Society (created in 2011 through a merger of Canterbury Building Society, Southern Cross Building Society, and two other financial institutions) became Heartland Bank on 17 December 2012. Several building societies remain. Zimbabwe. In Zimbabwe, "Central Africa Building Society" (CABS) is the leading building society, offering a diverse range of financial products and services that include transaction and savings accounts, mobile banking, mortgage loans, money market investments, term deposits and pay-roll loans. Similar organisations in other countries. In other countries there are mutual organisations similar to building societies. Operational differences from banks. Roll numbers. Because most building societies were not direct members of the UK clearing system, it was common for them to use a roll number to identify accounts rather than allocating a six-digit sort code and eight-digit account number under the BACS standards. More recently, building societies have tended to obtain sort-code and account number allocations within the clearing system, and hence the use of roll numbers has diminished.
When using BACS, the roll number is entered in the reference field, while the building society's generic sort code and account number are entered in the standard BACS fields.
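As an illustration of that field mapping, the following is a minimal, hypothetical sketch in Python; the record layout and all values are invented for illustration and do not correspond to any real society, account, or BACS implementation.

def bacs_fields(society_sort_code, society_account, roll_number):
    """Map a roll-number payment onto the standard BACS fields: the
    society's generic details go in the sort code and account number
    fields, and the roll number goes in the reference field."""
    return {
        "sort_code": society_sort_code,     # society's generic sort code
        "account_number": society_account,  # society's generic account number
        "reference": roll_number,           # identifies the member's account
    }

# Invented example values only.
print(bacs_fields("00-00-00", "00000000", "A123456-78"))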
4777
5229428
https://en.wikipedia.org/wiki?curid=4777
Blue Steel (missile)
The Avro Blue Steel was a British air-launched, rocket-propelled, nuclear-armed standoff missile, built to arm the V bomber force. It allowed the bomber to launch the missile against its target while still outside the range of surface-to-air missiles (SAMs). The missile proceeded to the target at speeds up to Mach 3, and would trigger within 100 m of the pre-defined target point. Blue Steel entered service in 1963, by which point improved SAMs with longer range had greatly eroded the advantages of the design. A longer-range version, Blue Steel II, was considered, but cancelled in favour of the much longer-range GAM-87 Skybolt system from the US. When development of that system was cancelled in 1962, the V-bomber fleet was considered highly vulnerable. Blue Steel remained the primary British nuclear deterrent weapon until the Royal Navy started operating Polaris ballistic missiles from "Resolution"-class submarines. Development. Origins. During the early 1950s, the interceptor aircraft of the Soviet PVO-Strany and their associated ground-controlled interception systems were steadily improving, making the approach to the V bombers' targets more difficult. An extensive study, "The Pattern of the Future Offensive", suggested that an attacking force of 100 aircraft would suffer 8% losses at night, and as much as 29% during the day, due to the much greater number of day fighters in Soviet service. The paper concluded with a summary showing the expected losses over time as the Soviet forces improved: by 1957 losses were expected to be 10 to 20% at night, with the situation eroding further thereafter. The UK intelligence services were also aware that the Soviets were planning to deploy an extensive surface-to-air missile (SAM) system around Moscow. The system had been developed to fend off a WWII-style attack by 1,000 low-speed aircraft, like the raids carried out by RAF Bomber Command. Known in the West as the SA-1 "Guild", the system was partially based on World War 2 research from the German "Wasserfall" program, and its V-300 missiles had a limited range of around at best. The system was expected to be deployed in the late 1950s, after which approaching a target to attack it with gravity bombs would become increasingly dangerous. At the time, the Soviet defences were at their best between and , and the existing V-bombers had the required performance to fly over this altitude. But the introduction of the SAMs would make this increasingly unlikely to work as more and more of the missiles were installed. There were two general solutions to this problem. One was to fly higher and faster, but the leap in performance needed to overfly the missiles, with altitudes on the order of , could not be met until the mid-1960s at the earliest. This would leave a gap between around 1961 and 1965 when the V-force could not be expected to successfully perform its mission. The conclusion of this paper, and other reports like it, was that the situation could be greatly improved through the combination of improved electronic countermeasures (ECM) to disrupt fighter operations, and a standoff missile to allow the bombers to turn back before approaching within the range of the SAMs. It was expected that this combination would allow the V force to remain effective until 1965 at least. Beyond that time a new supersonic bomber would be required, a requirement that would ultimately be taken up as the Avro 730. OR.1132. On 3 September 1954 the Air Staff issued Operational Requirement OR.1132 for the standoff weapon.
This was essentially issued to the Royal Aircraft Establishment (RAE), whose Guided Weapon Department had been studying such a system for some time and had produced a series of "W weapons" of different layouts. These had been designed to address the problem that the Atomic Weapons Research Establishment (AWRE) could not guarantee the size of the warhead. Accordingly, the W weapons were designed to the maximum possible size that each of the bombers could carry: diameter for the Vickers Valiant and for the Avro Vulcan and Handley Page Victor. The Ministry of Supply (MoS) did not feel that development should remain at the RAE, and had canvassed the bomber builders asking for comments on the RAE's concepts, considering both a shorter-range and a longer-range weapon. Avro, having recently given the go-ahead on the 730, appeared cool on the long-range concept and was primarily interested in a short-range weapon of perhaps range, which would allow it to avoid the most heavily defended areas. Vickers Armstrong and Handley Page were much more interested, proposing longer-ranged weapons that would extend the life of their existing V bomber designs. Handley Page's concept rejected the MoS's shorter range and offered a design of using ramjet power. However, the Ministry noted that the inertial navigation system (INS) being developed by Elliott Automation would not be accurate enough to fly this distance and hit its targets within the desired circular error probable (CEP). Vickers, who had worked on the somewhat similar Blue Boar and Red Rapier projects, would seem like the natural choice for the contract, and planned to have their weapon in operation two years earlier than the other teams. But the Ministry had a cold relationship with Vickers after their Red Dean efforts. Avro appears to have been selected due to a number of RAE personnel having been hired by the company to form their design department, the Weapons Research Division, including the Chief Engineer, R.H. Francis. On 4 May 1955 the Ministry of Supply issued the G/Proj/6220/GW35A contract with Avro and assigned it the rainbow code "Blue Steel". Avro's concept. At the time, nuclear weapon design was still in its early stages, and the intended weapon for the missile was a boosted fission weapon known as "Green Bamboo". To achieve the desired 1 Mt yield, the implosion had to be extremely symmetrical, and this required a 72-point explosive system that led to the weapon being in diameter and massing an estimated . This weapon was secret, and only the diameter and mass were revealed to prospective entrants. It was the large size of the warhead, and the resulting diameter fuselage needed to carry it, that led Avro to the use of a rocket motor when most standoff weapons to that point were jet or ramjet powered. Avro simply could not find a place to put an air intake on the missile which did not result in it being too large in some other dimension, typically length. The engine, the Armstrong Siddeley Stentor Mark 101, had two chambers, one for cruising at about Mach 1, and a second larger one that was ignited close to the target to increase speed to Mach 3 for a final dash. At Mach 3, skin friction heats the fuselage to around , which is higher than the temperature at which aluminium becomes soft. This is the reason most high-speed aircraft are generally limited to about Mach 2.4 or lower; flying above this speed risks damage to the airframe.
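The scale of the heating can be estimated from the standard stagnation-temperature relation for compressible flow; the following is a rough worked example, with the ambient temperature an assumed stratospheric value rather than a figure from the Blue Steel programme:

T_0 = T_\infty \left( 1 + \frac{\gamma - 1}{2} M^2 \right)

With \gamma = 1.4, M = 3, and T_\infty \approx 217 K, this gives T_0 \approx 217 \times 2.8 \approx 608 K, or roughly 335 °C, comfortably above the range in which common aluminium alloys begin to lose significant strength.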
Those aircraft that do operate above this speed have to use other materials, and for this role, Avro chose stainless steel. The difficulties encountered bending sheets of stainless steel led to initial designs for the weapon that were very "linear", with the fuselage made of a single sheet rolled into a cylinder and a second sheet rolled into a cone to form the nose area. Avro's initial response, in late 1955, outlined a four-stage development effort. In Stage 1, the weapon would be powered by a two-chamber rocket engine that would provide the required range at speeds up to Mach 2.5. Stage 2 would use an improved engine and more fuel to provide range up to at speeds up to Mach 4.5. Stage 3 would weigh in at and include both disposable boosters and a drop-off fuel tank, with range up to . For the future Avro 730, a new delta-wing design of 18,000 to 26,000 lb would offer range up to . Considering their submissions, the Ministry responded by ordering Avro to work only on the OR.1137 requirements, and to ignore the long-term developments. This produced the first complete design, W.100. They proposed building the missile from AF.520 stainless steel, expecting to learn how to do so using the 730 contract to build testbeds. A series of smaller designs, W.101 through 104, would be made from a variety of materials and used to test the flight dynamics, autopilot and INS. As part of ongoing industry-wide research, the design underwent several evolutionary changes. The wings and canards became deltas, and the experience of the Bristol 188 gave Avro confidence in constructing rounded surfaces from stainless steel; the design became much more aerodynamic, with a boat tail and an ogive nose. Delays. Avro began work proper in 1955, under the assigned Rainbow Code name of "Blue Steel", which it would keep in service. With Elliott working on the guidance system, Armstrong Siddeley would develop the liquid fuel engine. The design period was protracted, with various development problems exacerbated by the fact that designers lacked information on the actual size and weight of the proposed Green Bamboo, or its likely thermonuclear successor derived from the Granite series. The program ran into delays almost immediately. All of the systems – missile, navigation and motor – ran into problems, and this resulted in the companies pointing fingers at each other. By the late 1950s it was clear the missiles would not be ready for their 1962 initial operational introduction, leading to significant criticism of Avro on the part of the Ministry of Supply and the Air Ministry. By this time it was clear that the Soviets were going to install an even more extensive missile system than originally believed, one that would cover the approach routes to the targets and even have sites on the Russian coastline, well outside the range of the missile. Avro proposed that Blue Steel would evolve over time, with subsequent versions increasing speed (to Mach 4.5) and range. The ultimate Blue Steel would be a range weapon that could be launched by the supersonic Avro 730 under development. They were told to limit themselves to the specification of OR.1132. The project was delayed by the need to develop the required stainless steel fabrication techniques; this experience would have been gained in building the Avro 730, but that aircraft had been cancelled by then. The Elliott guidance system was plagued by accuracy problems, delaying test flights.
As it turned out, neither of the originally proposed UK-designed warheads was actually fitted, as both were superseded by Red Snow, an Anglicised variant of the U.S. W-28 thermonuclear warhead of 1.1 Mt yield. Red Snow was smaller and lighter than the earlier warhead proposals. The missile was fitted with a state-of-the-art inertial navigation unit. This system allowed the missile to strike within 100 metres of its designated target. In addition, the pilots of the Avro Vulcan or Handley Page Victor bombers could tie their systems into those of the missile and make use of the guidance system to help plot their own flight plan, since the unit in the missile was more advanced than that in the aircraft. On launch, the rocket engine's first chamber, developing thrust, would power the missile along a predetermined course to the target at around Mach 1.5. Once close to the target, the second chamber of the engine (6,000 lbf) would accelerate the missile to Mach 3. Over the target the engine would cut out and the missile would free-fall before detonating its warhead as an air burst. To speed the trials at Woomera, the test rounds were flown there by Victors and Vulcans in Operation Blue Ranger. The trials began in 1960, around the time the original requirement had expected the weapon to be in service. The missiles were prepared at the Weapons Research Establishment near Salisbury, South Australia, and flown from RAAF Edinburgh to be launched on the Woomera range. A specialist unit, No. 4 Joint Services Trials Unit RAF, was established to carry out preparatory and operational tasks. Blue Steel finally entered service in February 1963, carried by Vulcans and Victors, although its limitations were already apparent. The short range of the missile meant that the V bombers were still vulnerable to enemy surface-to-air missiles. A replacement for Blue Steel, the Mark 2, was planned with increased range and a ramjet engine, but was cancelled in 1960 to minimise delays to the Mk.1. The UK sought to acquire the much longer-ranged United States AGM-48 Skybolt air-launched ballistic missile, and was greatly frustrated when that weapon was cancelled in late 1962. Blue Steel required up to seven hours of launch preparation and was highly unreliable. The Royal Air Force estimated in 1963 that half the missiles would fail to fire and would have to be dropped over their targets, defeating their purpose of serving as standoff weapons. Even as it deployed Blue Steel as a high-altitude weapon, that year the government decided that, because of anti-aircraft missiles' increasing effectiveness, V bombers would have to convert from high-altitude to low-altitude attacks. Low-altitude trials were conducted in 1964 and concluded in 1965. With no effective long-range weapon, the original Blue Steel served on after a crash programme of minor modifications to permit a low-level launch at , even though its usefulness in a hot war was likely limited. A stop-gap weapon (WE.177B) was quickly produced to extend the life of the V-bomber force in the strategic role until the Polaris missile was deployed. This WE.177 laydown weapon supplemented the remaining modified Blue Steel missiles, using a low-level penetration followed by a pop-up manoeuvre to release the weapon at . One live operational round was deployed on each of forty-eight Vulcan and Victor bombers, and a further five live rounds were produced as operational spares.
An additional four non-nuclear rounds were produced for various RAF requirements, and there were sixteen other unspecified training rounds. Blue Steel was officially retired on 31 December 1970, with the United Kingdom's strategic nuclear capacity passing to the submarine fleet.
4778
7903804
https://en.wikipedia.org/wiki?curid=4778
Branch Davidians
The Branch Davidians (or the General Association of Branch Davidian Seventh-day Adventists, or the Branch Seventh-day Adventists) are a religious sect founded in 1955 by Benjamin Roden. They regard themselves as a continuation of the General Association of Davidian Seventh-Day Adventists, established by Victor Houteff in 1935. Houteff, a Seventh-day Adventist, wrote a series of tracts entitled the "Shepherd's Rod" that called for reform of the Seventh-day Adventist Church. After his ideas were rejected, Houteff and his followers formed the group that became known as "Davidians", and some moved onto land outside Waco, Texas. They built a community called the Mount Carmel Center, which served as headquarters for the movement. After Houteff's death in 1955, his wife Florence took control of the organization. That same year, Benjamin Roden, a follower of Houteff, proclaimed what he believed to be a new message from God and wrote letters presenting it to Davidians. In 1957, Florence sold the Mount Carmel Center and purchased land near Elk, Texas – northeast of Waco – naming it New Mount Carmel Center. After the failure of Florence's prophecy of apocalyptic events on or near April 22, 1959, she dissolved the Davidian Association in 1962 and sold all but a portion of the New Mount Carmel property. Benjamin Roden took possession of it in 1962 and began efforts to purchase the remainder. On February 27, 1973, New Mount Carmel was sold to Benjamin, his wife Lois Roden, and their son George Roden. From then on, the property was simply known as Mount Carmel. Upon the death of Benjamin Roden in 1978, Lois became the next Davidian prophet at the compound. In 1981, a young man named Vernon Howell, later known as David Koresh, came to Mount Carmel and studied biblical prophecy under Lois Roden. By 1983, Howell (Koresh) had gained a group of followers that separated from Lois's organization to form "The Davidian Branch Davidian Seventh Day Adventist Association". Meanwhile, Lois continued to operate the Branch Davidian Seventh Day Adventist Association from the Mount Carmel Center near Waco. Koresh's group and the Branch Davidians (Lois's group) were two separate organizations with different leaders, names, and locations from 1983. It was not until 1987, after Lois died, that Koresh filed a document claiming to be the president of the Branch Davidian Seventh Day Adventist Association. Koresh and his followers then went to the Mount Carmel Center, engaging in a shootout with George Roden that eventually resulted in Koresh's group occupying the land. These actions are regarded by Branch Davidians who remained loyal to Lois Roden as an act of identity theft against them. Koresh's leadership ended at the Waco siege of 1993, a fifty-one–day standoff between the sect and federal agents. Four agents of the U.S. Bureau of Alcohol, Tobacco, and Firearms (ATF) and two residents were killed by the sect during the initial raid, while four sect members were killed by ATF agents on February 28, 1993. Seventy-six members of Koresh's group, including many children, died in a fire that erupted during the siege on April 19, 1993. Early history. In 1929, Victor Houteff, a Bulgarian immigrant and a Seventh-day Adventist Sabbath School teacher from southern California, claimed that he had a new message for the entire Adventist church. He presented his views in a book, "The Shepherd's Rod: The 144,000 – A Call for Reformation".
The Adventist leadership rejected Houteff's views as contrary to the church's basic teachings, and local church congregations disfellowshipped Houteff and his followers. In 1934, Houteff established his headquarters to the west of Waco, Texas, and his group became known as the Davidians. In 1942, he renamed the group the General Association of Davidian Seventh-day Adventists; 'Davidian' indicated its belief in the restoration of the Davidic Kingdom of Israel. Following Houteff's death in 1955, his wife Florence usurped the leadership, believing herself to be a prophet. Convinced that an apocalypse would occur in 1959, a date not found in her husband's original writings, Florence and her council gathered hundreds of their faithful followers at the Mount Carmel Center, the group's compound near Waco, for the fulfillment of the prophecy written in Ezekiel 9. The anticipated events did not occur, and following this disappointment, Benjamin Roden formed another group, which he called the Branch Davidians, and succeeded in taking control of Mount Carmel. The name of this group is an allusion to the anointed 'Branch' (mentioned in Zechariah 3:8; 6:12). When Benjamin Roden died in 1978, he was succeeded by his wife Lois Roden. Members of the Branch Davidians were torn between allegiance to Ben's wife and to his son, George. After Lois died, George assumed the presidency. However, less than a year later, Vernon Howell rose to power and became the leader of those in the group who sympathized with him. Rise of David Koresh. Howell's arrival at Mount Carmel in 1981 was well received by nearly everyone at the Davidian commune. He had engaged in an affair with Lois Roden while he was in his early 20s and she was in her late 60s. Howell wanted to father a child with her who, according to his understanding, would be the Chosen One. When she died, George Roden inherited the positions of prophet and leader of the sect. A power struggle ensued between Roden and Howell, who soon gained the loyalty of the majority of the Davidians. In 1984, Howell and his followers left Mount Carmel after Roden, who subsequently renamed the compound "Rodenville", accused Howell of having started a fire that consumed a $500,000 administration building and press. Another splinter group, led by Charlie Pace, left and settled in Alabama. In an attempt to regain support, Roden challenged Howell to raise the dead, going so far as to exhume the corpse of a Davidian who had been dead for two decades in order to demonstrate his spiritual supremacy. (Roden denied this, saying he had only been moving the community cemetery.) This illegal act gave Howell an opportunity to attempt to file charges against Roden, but he was told that he needed evidence in order to substantiate the charges. On November 3, 1987, Howell and seven of his followers raided Mount Carmel, equipped with five .223-caliber semi-automatic rifles, two .22-caliber rifles, two 12-gauge shotguns, and nearly four hundred rounds of ammunition, in an apparent attempt to retake the compound. Although Howell's group claimed that it was trying to obtain evidence of Roden's illegal activities, its members did not take a camera with them. The trial ended with the jury finding Howell's followers not guilty, but the jury members were unable to agree on a verdict for Howell himself. After his followers were acquitted, Howell invited the prosecutors to Mount Carmel for ice cream.
It is claimed that Howell was never authorized to name his breakaway sect the "Branch Davidians", and the church which bears that name continues to represent the members of the Branch church who did not follow him. As a spiritual leader. Howell, who acquired the position of spiritual leader from Roden, asserted it by changing his name to David Koresh, suggesting that he had ties to the biblical King David and Cyrus the Great (Koresh is the Hebrew version of the name Cyrus). He wanted to create a new lineage of world leaders, fathering children with several women in the group. This practice later served as the basis for allegations that Koresh was committing child abuse, which contributed to the siege by the ATF. Interpreting Revelation 5:2, Koresh identified himself with the Lamb mentioned therein. This is traditionally believed to symbolize Jesus Christ; however, Koresh suggested that the Lamb would come before Jesus and pave the way for his Second Coming. By the time of the 1993 Waco siege, Koresh had encouraged his followers to think of themselves as "students of the Seven Seals," rather than as "Branch Davidians." During the standoff, one of his followers publicly announced that he wanted them to thereafter be identified by the name "Koreshians". Federal siege. On February 28, 1993, at 4:20 am, the Bureau of Alcohol, Tobacco, and Firearms attempted to execute a search warrant relating to alleged sexual abuse charges and illegal weapons violations. The ATF attempted to breach the compound for approximately two hours, until their ammunition ran low. Four ATF agents (Steve Willis, Robert Williams, Todd McKeehan, and Conway Charles LeBleu) were killed and another sixteen agents were wounded during the raid. The five Branch Davidians killed in the 9:45 a.m. raid were Winston Blake (British), Peter Gent (Australian), Peter Hipsman, Perry Jones, and Jaydean Wendell; two were killed by the Branch Davidians. Almost six hours after the ceasefire, Michael Schroeder was shot dead by ATF agents, who alleged he fired a pistol at agents as he attempted to re-enter the compound with Woodrow Kendrick and Norman Allison. His wife said he was merely returning from work and had not participated in the day's earlier altercation. After the raid, ATF agents established contact with Koresh and others inside the compound. The FBI took command after the deaths of the federal agents, and managed to facilitate the release of nineteen children (without their parents) relatively early into the negotiations. The children were then interviewed by the FBI and the Texas Rangers. On April 19, 1993, the FBI moved for a final siege of the compound, using large weaponry such as .50 caliber (12.7 mm) rifles and armored combat engineering vehicles (CEVs) to combat the heavily armed Branch Davidians. The FBI attempted to use tear gas to flush out the Branch Davidians. Officially, FBI agents were only permitted to return any incoming fire, not to actively assault the Branch Davidians. When several Branch Davidians opened fire, the FBI's response was to increase the amount of gas being used. Around noon, three fires broke out simultaneously in different parts of the building. The government maintains that the fires were deliberately started by Branch Davidians. Some Branch Davidian survivors maintain that the fires were started either accidentally or deliberately by the assault.
Of the eighty-five Branch Davidians in the compound when the final siege began, seventy-six died on April 19 in various ways: from falling rubble, from the suffocating effects of the fire, or by gunshot from fellow Branch Davidians. The siege had lasted fifty-one days. Aftermath. In all, four ATF agents were killed, sixteen were wounded, and six Branch Davidians died in the initial raid on February 28. Seventy-six more died in the final assault on April 19. The events at Waco spurred criminal prosecution and civil litigation. A federal grand jury indicted twelve of the surviving Branch Davidians – including Clive Doyle, Brad Branch, Ruth Riddle, and Livingstone Fagan – charging them with aiding and abetting in the murder of federal officers, and unlawful possession and use of various firearms. Eight Branch Davidians were convicted on firearms charges, five were convicted of voluntary manslaughter, and four were acquitted of all charges. As of July 2007, all Branch Davidians had been released from prison. Civil suits were brought against the United States government, federal officials, former governor of Texas Ann Richards, and members of the Texas Army National Guard. The bulk of these claims were dismissed because they were insufficient as a matter of law or because the plaintiffs could advance no material evidence in support of them. One case, "Andrade v. Chojnacki", made it to the Fifth Circuit, which upheld a previous ruling of "take-nothing, denied". After the siege. There are several groups that claim descent from the Branch Davidians today. The group that retains the original name "Branch Davidian Seventh Day Adventist" regards Doug Mitchell (who joined the Branch Davidians in 1978 and led the group from 1986 until his death in 2013) as Lois Roden's immediate successor, and Trent Wilde (who has led the group since 2013) as Mitchell's successor. This group never followed David Koresh. Another group exists under the leadership of Charles Pace, called The Branch, The Lord Our Righteousness. It is a legally recognized denomination with twelve members. Pace, while regarding Koresh as appointed by God, says that Koresh "twisted the Bible's teachings by fathering more than a dozen children with members' wives". Pace believes that the Lord "has anointed me and appointed me to be the leader", but he says he is "not a prophet" but "a teacher of righteousness". Others, once led by Clive Doyle, continue to believe Koresh was a prophet and await his resurrection, along with that of the followers who were killed. Both of these groups are still waiting for the end times. Doyle died in June 2022. Flag. The original Davidian flag, as described in the "Catalog-Syllabus" (1942), showed a red lion inside a black six-pointed star, surrounded by five-pointed stars, on a green field. On February 28, 1993, the day of their initial confrontation with the ATF, a similar flag made of fringed satin was visible hanging in a front window of the compound. This flag had a pink six-pointed star surrounded by five-pointed stars, on a gold field. It had been presented to Benjamin Roden by northern Branch Davidians, but Roden never flew it because "the colors were wrong". Koresh, however, decided to hang it, and although it was badly damaged on the first day, it continued to hang in tatters throughout the siege. The Waco sect hoisted a different flag on their flagpole on March 1, 1993, after being surrounded by federal agents. This flag was also made of satin and was sewn in part by Kathy Jones.
Steve Schneider described it as containing a "stylistic Star of David" and the fiery flying serpent mentioned in Isaiah 14. According to Koresh, the flag signified Mount Carmel's status as a sovereign power. This flag was torn from its pole during the Mount Carmel compound fire, and the ATF replaced it with an American flag, a Texas flag, and their own flag. Carol Howe reported that a Branch Davidian flag was hanging in Elohim City as of January 1995, before the Oklahoma City bombing. Relationship with Seventh-day Adventists. The Seventh-day Adventist Church, the main church in the Adventist tradition, rejected Victor Houteff's teachings and revoked his membership in 1930. Houteff then went on to found the Davidian Seventh Day Adventist Association (an offshoot which is also known as the Shepherd's Rod). The Branch Davidians are an offshoot of the Davidians, the product of a schism initiated by Benjamin Roden after Houteff's death, in light of the usurpation of power by Houteff's wife Florence. Florence believed that she was a prophet, but her prediction of the demise of the Seventh-day Adventist Church, which according to her should have occurred forty-two months after Houteff's death (in 1959), failed to materialize. Likewise, Ben Roden believed that he was a prophet as well as a rightful heir to the leadership of the Davidians. While still formally members of the Seventh-day Adventist Church, the Branch Davidian leaders demanded a reform of the church; when their demand was met with opposition (by both the Seventh-day Adventists and the Davidians), they decided to leave that denomination and, at the same time, widely distanced themselves from the Davidians. The Seventh-day Adventist Church deprived both the Branch Davidians and the Davidians of their membership in the denomination. In spite of this, the Branch Davidians actively continued to "hunt" members of the Seventh-day Adventist Church and encourage them to leave it and join their group. The Seventh-day Adventists were reportedly "apprehensive" about the group's views because Branch Davidians claimed that they were the "only rightful continuation of the Adventist message", based on their belief that Victor Houteff was the divinely selected prophet and the successor of Ellen G. White. Both the Davidians and the Branch Davidians claimed Houteff, as the founder of the Davidians, as their spiritual inspiration. The Seventh-day Adventist Church issued warnings about the Branch Davidian sect's views to its members on a regular basis. Schisms within the Branch Davidian sect. There is documented evidence (FBI negotiation transcripts from the standoff, featuring Kathryn Schroeder and Steve Schneider, with interjections from Koresh himself) that David Koresh and his followers did not call themselves Branch Davidians. In addition, David Koresh, through forgery, stole the identity of the Branch Davidian Seventh-day Adventists for the purpose of obtaining the New Mount Carmel Center's property.
4779
42065516
https://en.wikipedia.org/wiki?curid=4779
Burwash Hall
Burwash Hall refers to both Burwash Dining Hall and Burwash Hall proper, the second oldest of the residence buildings at Victoria University in Toronto, Canada. Construction began in 1911 and was completed in 1913. It was named after Nathanael Burwash, president of Victoria from 1887 to 1912. The building is an extravagant Neo-Gothic work with turrets, gargoyles, and battlements. The architects were Messrs. Sproatt & Rolph. History. In 1910, seven years after the opening of the women's residence, Annesley Hall, the administrators of Victoria University began plans for an elaborate men's residence building on the campus. The project was funded by the estate of Mr. Hart A. Massey, an alumnus from Victoria's early days in Cobourg. The full cost of the project is unknown to the public, but the asset was valued at $450,000 in 1913. Famous residents of Burwash include Vincent Massey, Lester B. Pearson, Don Harron, and Donald Sutherland. Architecture. The building is described as Collegiate Gothic, intended to include all the developments of the Tudor style. Burwash Hall was designed to resemble the residences in Oxford and Cambridge, with modifications to the staircase system and the division of houses. Sproatt & Rolph avoided architecture of commercial appearance, envisioning a structure that was academic in feeling. Burwash Hall was not intended to replicate Oxford buildings, but to "prove that academic Gothic can be indigenous in Canada as well as in England, and that it can be perfectly adapted to the exigencies of [Canada's] climate and life". Constructed of Bedford Indiana cut stone and Georgetown rubble masonry, the residence houses and adjoining dining hall were intended to prove that beauty and efficiency were not antithetical. Design. The building is divided between the large dining hall in the northwest and the student residence proper. The residence area is divided into two sections: the Upper Houses, built in 1913, and the Lower Houses, built in 1931. To the west the Upper Houses look out on the Vic Quad and the main Victoria College building across it. West of the Lower Houses is the new Lester B. Pearson Garden of Peace and International Understanding and the E.J. Pratt Library beyond it. From the eastern side of the building, the Upper Houses look out at Rowell Jackman Hall and the Lower Houses see the St. Michael's College residence of Elmsley. The only exception is the view from Gate House Tower, which looks down St. Mary's Street. Burwash Dining Hall. The dining hall is perhaps the best known part of the building to outsiders. It is the University of Toronto's largest dining hall, holding some 250 students and sixteen large tables. Hanging on the western wall is Queen Victoria's burial flag, given to the college soon after her death. Under the flag is the high table where the professors and college administration lunch. Historically, the Upper Houses each had their own table. Gate sat in the southwest corner, Middle sat in the far northeast, South sat at the table to the west of Middle, while North sat to the west of the southeast corner. The only lower house to have had a designated table was Caven, in the northwest corner beside the alumni table. (Note that prior to the 1995 renovations, some of these houses, particularly North and Caven, 'traditionally' sat elsewhere). Upper Houses. Adjoining Burwash Dining Hall and completed in 1913, the upper houses were originally known as the "Men's Residences." The four houses are: North House, Middle House, Gate House, and South House.
The upper houses were gutted and renovated in 1995. Each Upper House consists of three floors. The lower floor contains a common room equipped with kitchen facilities, couches and a television. The upper floors each have their own kitchen and dining area. All houses have a high ratio of washrooms to residents, with many single-use washrooms and a communal washroom on each floor. Upper Houses are divided between double rooms and singles, with about sixty percent of the population in doubles. Unlike the Lower Houses, the hallways of each Upper House are connected, a feature that was added in the 1995 renovations. There is typically an upper-year Residence Don for each of the Upper Houses. This has been the case since the building's construction in 1913, when specific rooms were included for Dons. The original location of these rooms has since changed with recent renovations. North House. At the corner of Burwash Dining Hall and the Upper Houses lies North House. The emblem of North House is an oil lamp, originally used to represent Victoria's Faculty of Theology, established in 1871. North has an extended hallway and a larger common room than the other Upper Houses due to its position on the corner of the building. Middle House. The largest of the Upper Houses due to its incorporation of the two battlements which divide North and Gate House, Middle House is the centre of the Upper Houses. The owl is the emblem of Middle House, sourced from the Victoria College coat of arms, and originally used to represent the Faculty of Arts. The Don's room in Middle House was originally the room reserved for the Dean of Men at Victoria, fitted with a fireplace and seating area. Gate House. Gate House is one of the four Upper Houses of the Burwash Hall residence. Until 2007, when Victoria administration made it co-ed, Gate House was one of the last remaining all-male residence buildings at the University of Toronto. The Gate House emblem is the phoenix, visible in the bottom-right corner of the Victoria College insignia. Gate House, with the rest of Upper Burwash, opened in 1913 and has held students every year since then except 1995, when it was renovated. As an all-male residence from 1913 to 2007 it held a number of unique traditions. For 20 years Gate House hosted an annual party called Novemberfest in the Burwash dining hall. The Victoria Dean of Students cancelled Novemberfest in 2003, when police discovered widespread underage drinking and over 800 people in the dining hall, in violation of the fire code. Another Gate House tradition that no longer occurs is "stirring the chicken," a dinner and keg party where house members cooked chicken fajitas for hundreds of guests. Until 2007, Gate House held secretive first-year initiation ceremonies called Traditionals, which involved writing slogans on campus buildings in chalk, singing songs to the all-women's residence (whose residents would then sing back to them), and leading first-years around the house blindfolded. After Novemberfest's cancellation, Gate House continued to come into conflict with the administration. In 2004 the Dean evicted three Gate House residents for allegedly "hog-tying" a first-year student. In 2007 President Paul W. Gooch wrote that Gate House undertook an "escalating series of actions" that were "defiant" and "disparaging of women", in response to Gate members constructing a 2.5-metre snow penis and placing a cooked pig's head in an Annesley bathroom.
As punishment, during the fall exam period Gooch evicted two residents, relocated the remainder of Gate House to other places in the residence system, and banned all current Gate House students from entering the building in 2008. Since this decision Gate House has become a co-ed residence identical to the other Upper Burwash houses. Notable residents of Gate House include Lester B. Pearson, former Prime Minister of Canada, and Simon Pulsifer, whom "Time" magazine nicknamed "The Duke of Data" for his contributions to Wikipedia. During its 93 years as a men's residence, Gate House developed a distinct character and reputation for antics that included pranks, toga parties, streaking, caroling to other residences, hazing rituals, "beer bashes" and "incessant pounding" on the Gate House table in the dining hall. Paul Gooch wrote that these traditions gave Gate House an "ethos" that contradicted his vision of residence life. The all-male Gate House was known as a social centre and a spirited, tight-knit community. According to Grayson Lee, who created the snow penis sculpture in 2007, most of its residents were "heartbroken" to leave. Former Gate House President Dave Ruhl commented that "the Gate House camaraderie is unique" and that living there was "one of the most important parts of the university experience" for many. The Reuters news agency nicknamed Gate House "U of T's Animal House" because Donald Sutherland's memories of its parties are said to have influenced the script of the 1978 movie. The Toronto Star described Gooch's decision to put an end to its traditions, activities and distinguishing characteristics as "neutering Animal House." Gate House has three floors which house 28 students, as well as a don and the Victoria College Residence Life Co-ordinator. Above the gate there is a tower that rises three stories higher and has a turret-style roof. The first floor has one double room and one bathroom available to students. About half of the floor is taken up by the apartment of the Residence Life Coordinator. Lastly, on the first floor there is a house common room with a kitchen and two couches. The second floor has three double rooms and seven single rooms. It has three single washrooms and one larger communal one, as well as its own kitchen. This floor is home to the residence don, who has a larger room with a private washroom. The third floor is identical to the second except that in place of the don's room there are two single rooms. South House. Adjoining St. Mary's Arch and the Lower Houses, South House is the furthest Upper House from Burwash Dining Hall. The emblem of the house is a sphinx, which was originally used to represent Victoria's Faculty of Law, established in 1860. Lower Houses. The Lower Houses, originally the "Emmanuel College Residences," were intended to house theology students at Emmanuel College, whose current building was opened the same year. Ryerson House (renamed First House in 2021), Nelles House, Caven House, and Bowles-Gandier House are now mostly home to undergraduate arts and science students. Each storey is lower than in the Upper Houses, so they consist of four floors in order to reach the same height. All five houses are connected underground via the basement. The lower houses have only been partially upgraded. Before the renovations the entire building was exclusively male, but now every house is co-ed. The Lower Houses each have four floors but are much narrower, with each level having only four rooms.
Each level also has its own kitchen, but these are much smaller than in the Upper Houses. The Lower Houses do have far larger and better-fitted common rooms, similar to the ones the Upper Houses had before the renovations. Rooms in the Lower Houses are also considered more luxurious, with hardwood floors and larger dimensions, though they are more expensive. Until 2003 the Lower Houses were restricted to upper-year students, but with the double cohort of graduates from Ontario schools, many of the rooms were converted into doubles and now hold first-years. There is typically one upper-year Residence Don for First, Nelles, and Caven House, and a second upper-year Residence Don for Bowles-Gandier House. First House (Ryerson House). First House connects the Upper and Lower Houses, and was originally named after the first principal of Victoria College, Egerton Ryerson. The house was renamed for the 2021–2022 school year, after an investigation into the legacy of Ryerson and his role in the Canadian residential school system. The Victoria University Students' Administrative Council initially called for a renaming in February 2019, which was followed by the Victoria University Research Panel on the Legacy of Egerton Ryerson in 2020, and ultimately the Presidential Report on the Legacy of Egerton Ryerson in 2021, which resulted in the change. Although the house had been named "Ryerson" since 1933, all five Lower Houses were temporarily named First House, Second House, Third House, Fourth House, and Fifth House from 1931 until 1933. The Board of Regents used these placeholder names until they decided upon better alternatives. In 2021, the name of Ryerson House was reverted to its original title, First House. Nelles House. Named for Victoria principal, president, and chancellor Samuel S. Nelles, Nelles House echoes the position of Middle House amongst the Upper Houses. Like Middle, Nelles House sits between the first and third houses in the set. Unlike Middle, however, Nelles is exactly the same size and shape as Ryerson and Caven House. Caven House. The only Lower House not named after a Victoria College student or administrator, Caven House takes its name from Presbyterian minister William Caven, the second principal of Knox College, Toronto. Although he was not directly involved with the affairs of Victoria, he consulted President Nathanael Burwash and Rev. Alfred Gandier about the division of Arts and Theology at Victoria. These discussions eventually spawned Emmanuel College, Victoria's Faculty of Theology. Since the Lower Houses were originally built for Emmanuel students, this is likely the reason the house was named after William Caven. Bowles-Gandier House. Name. Bowles House is named after Victoria president and chancellor Richard Pinch Bowles, the first cousin once removed of Lester B. Pearson. Gandier House is named after Alfred Gandier, the first principal of Emmanuel College. Exterior Design. The last house in Burwash Hall has a style distinct from the other three Lower Houses. Bowles-Gandier House, colloquially "BG", juts out on the west side of the southern edge of Burwash Hall. Enclosing the residence structure along the property line where Victoria meets St. Michael's College, the house forms an L shape with the other three Lower Houses. Interior Layout. Unlike Ryerson, Nelles, and Caven House, which are all identical in layout, Bowles-Gandier is an amalgamation of Gandier House and Bowles House, which operated independently until the 1995 renovations.
The floor plan for Bowles-Gandier varies from the other Lower Houses, with a significantly larger common room and two triple bedrooms on the main floor, each with an ensuite kitchen and bathroom. The Bowles and Gandier stairwells are connected by a hallway on the second, third, and fourth floors. Each of these floors has a shared kitchen, and two bedrooms at the end of each hall with a connected bathroom, much like the other Lower Houses. They also have two double rooms in the middle of the hall, which share a connected bathroom and kitchen, meaning BG has the highest kitchen-to-student ratio in Burwash. Bowles-Gandier is sometimes referred to as the Vic One House, since it is mostly reserved for students in the Vic One program. While enrolment in Vic One is not a requirement to live in BG, the house has historically been home to many Vic One students.
4781
7903804
https://en.wikipedia.org/wiki?curid=4781
Benzodiazepine
Benzodiazepines (BZD, BDZ, BZs), colloquially known as "benzos", are a class of central nervous system (CNS) depressant drugs whose core chemical structure is the fusion of a benzene ring and a diazepine ring. They are prescribed to treat conditions such as anxiety disorders, insomnia, and seizures. The first benzodiazepine, chlordiazepoxide (Librium), was discovered accidentally by Leo Sternbach in 1955, and was made available in 1960 by Hoffmann–La Roche, which followed with the development of diazepam (Valium) three years later, in 1963. By 1977, benzodiazepines were the most prescribed medications globally; the introduction of selective serotonin reuptake inhibitors (SSRIs), among other factors, decreased rates of prescription, but they remain frequently used worldwide. Benzodiazepines are depressants that enhance the effect of the neurotransmitter gamma-aminobutyric acid (GABA) at the GABAA receptor, resulting in sedative, hypnotic (sleep-inducing), anxiolytic (anti-anxiety), anticonvulsant, and muscle relaxant properties. High doses of many shorter-acting benzodiazepines may also cause anterograde amnesia and dissociation. These properties make benzodiazepines useful in treating anxiety, panic disorder, insomnia, agitation, seizures, muscle spasms, alcohol withdrawal and as a premedication for medical or dental procedures. Benzodiazepines are categorized as short-, intermediate-, or long-acting. Short- and intermediate-acting benzodiazepines are preferred for the treatment of insomnia; longer-acting benzodiazepines are recommended for the treatment of anxiety. Benzodiazepines are generally viewed as safe and effective for short-term use of two to four weeks, although cognitive impairment and paradoxical effects such as aggression or behavioral disinhibition can occur. According to the Department of Health of the Australian state of Victoria, long-term use can cause "impaired thinking or memory loss, anxiety and depression, irritability, paranoia, aggression, etc." A minority of people have paradoxical reactions to benzodiazepines, such as worsened agitation or panic. Benzodiazepines are often prescribed for as-needed use, which is under-studied, but probably safe and effective to the extent that it involves intermittent short-term use. Benzodiazepines are associated with an increased risk of suicide due to aggression, impulsivity, and negative withdrawal effects. Long-term use is controversial because of concerns about decreasing effectiveness, physical dependence, benzodiazepine withdrawal syndrome, and an increased risk of dementia and cancer. The elderly are at an increased risk of both short- and long-term adverse effects, and as a result, all benzodiazepines are listed in the Beers List of inappropriate medications for older adults. There is controversy concerning the safety of benzodiazepines in pregnancy. While they are not major teratogens, uncertainty remains as to whether they cause cleft palate in a small number of babies and whether neurobehavioural effects occur as a result of prenatal exposure; they are known to cause withdrawal symptoms in the newborn. In an overdose, benzodiazepines can cause dangerous deep unconsciousness, but they are less toxic than their predecessors, the barbiturates, and death rarely results when a benzodiazepine is the only drug taken. Combined with other central nervous system (CNS) depressants such as alcohol and opioids, the potential for toxicity and fatal overdose increases significantly.
Benzodiazepines are commonly used recreationally, are often taken in combination with other addictive substances, and are controlled in most countries. Medical uses. Benzodiazepines possess psycholeptic, sedative, hypnotic, anxiolytic, anticonvulsant, muscle relaxant, and amnesic actions, which are useful in a variety of indications such as alcohol dependence, seizures, anxiety disorders, panic, agitation, and insomnia. Most are administered orally; however, they can also be given intravenously, intramuscularly, or rectally. In general, benzodiazepines are well tolerated and are safe and effective drugs in the short term for a wide range of conditions. Tolerance can develop to their effects, there is a risk of dependence, and a withdrawal syndrome may occur upon discontinuation. These factors, combined with other possible secondary effects after prolonged use, such as psychomotor, cognitive, or memory impairments, limit their long-term applicability. The effects of long-term use or misuse include the tendency to cause or worsen cognitive deficits, depression, and anxiety. The College of Physicians and Surgeons of British Columbia recommends discontinuing the usage of benzodiazepines in those on opioids and those who have used them long term. Benzodiazepines are associated with serious adverse health outcomes, findings which support clinical and regulatory efforts to reduce their usage, especially in combination with non-benzodiazepine receptor agonists. Panic disorder. Benzodiazepines are usually administered orally; however, very occasionally lorazepam or diazepam may be given intravenously for the treatment of panic attacks. Because of their effectiveness, tolerability, and rapid onset of anxiolytic action, benzodiazepines are frequently used for the treatment of anxiety associated with panic disorder. However, there is disagreement among expert bodies regarding the long-term use of benzodiazepines for panic disorder. Views range from the position that benzodiazepines are not effective long-term and should be reserved for treatment-resistant cases, to the position that they are as effective in the long term as selective serotonin reuptake inhibitors (SSRIs). American Psychiatric Association (APA) guidelines, published in January 2009, note that, in general, benzodiazepines are well tolerated, and their use for the initial treatment of panic disorder is strongly supported by numerous controlled trials. The APA states that there is insufficient evidence to recommend any of the established panic disorder treatments over another. The choice of treatment between benzodiazepines, SSRIs, serotonin–norepinephrine reuptake inhibitors (SNRIs), tricyclic antidepressants, and psychotherapy should be based on the patient's history, preference, and other individual characteristics. Selective serotonin reuptake inhibitors are likely to be the best choice of pharmacotherapy for many patients with panic disorder, but benzodiazepines are also often used, and some studies suggest that these medications are still used with greater frequency than the SSRIs. One advantage of benzodiazepines is that they alleviate anxiety symptoms much more quickly than antidepressants, and therefore may be preferred in patients for whom rapid symptom control is critical. However, this advantage is offset by the possibility of developing benzodiazepine dependence. The APA does not recommend benzodiazepines for persons with depressive symptoms or a recent history of substance use disorder.
APA guidelines state that, in general, pharmacotherapy of panic disorder should be continued for at least a year, and that clinical experience supports continuing benzodiazepine treatment to prevent recurrence. Although major concerns about benzodiazepine tolerance and withdrawal have been raised, there is no evidence for significant dose escalation in patients using benzodiazepines long-term. For many such patients, stable doses of benzodiazepines retain their efficacy over several years. The UK-based National Institute for Health and Clinical Excellence (NICE), whose guidelines drew on a systematic review using a different methodology, came to a different conclusion. It questioned the accuracy of studies that were not placebo-controlled and, based on the findings of placebo-controlled studies, does not recommend use of benzodiazepines beyond two to four weeks, as tolerance and physical dependence develop rapidly, with withdrawal symptoms, including rebound anxiety, occurring after six weeks or more of use. Nevertheless, benzodiazepines are still prescribed for long-term treatment of anxiety disorders, although specific antidepressants and psychological therapies are recommended as the first-line treatment options, with the anticonvulsant drug pregabalin indicated as a second- or third-line treatment suitable for long-term use. NICE stated that long-term use of benzodiazepines for panic disorder with or without agoraphobia is an unlicensed indication, does not have long-term efficacy, and is, therefore, not recommended by clinical guidelines. Psychological therapies such as cognitive behavioural therapy are recommended as a first-line therapy for panic disorder; benzodiazepine use has been found to interfere with therapeutic gains from these therapies. Generalized anxiety disorder. Benzodiazepines have robust efficacy in the short-term management of generalized anxiety disorder (GAD) when standardized measures of anxiety are used as the outcome variable, but have not demonstrated a favorable dropout rate compared to placebo in one meta-analysis. A newer meta-analysis showed that benzodiazepines are significantly more effective than serotonergic agents, regardless of treatment length. More research is needed, but newer randomized controlled trials of the off-patent benzodiazepines are scarce. According to the National Institute for Health and Clinical Excellence (NICE), benzodiazepines can be used in the immediate management of GAD, if necessary. However, they should not usually be given for longer than 2–4 weeks. The only medications NICE recommends for the longer-term management of GAD are antidepressants. Likewise, the Canadian Psychiatric Association (CPA) recommends the benzodiazepines alprazolam, bromazepam, lorazepam, and diazepam only as a second-line choice, if treatment with two different antidepressants has been unsuccessful. Although they are second-line agents, benzodiazepines can be used for a limited time to relieve severe anxiety and agitation. CPA guidelines state that after 4–6 weeks the effect of benzodiazepines may decrease to the level of placebo, and that benzodiazepines are less effective than antidepressants in alleviating ruminative worry, the core symptom of GAD, but that in some cases, prolonged treatment with benzodiazepines as an add-on to an antidepressant may be justified.
Evidence from reviews of benzodiazepine tolerance mechanisms and of clonazepam use in psychiatric disorders is strongly discordant with the claim that benzodiazepines lose anxiolytic efficacy over weeks: these reviews present RCT evidence of continued anxiolytic efficacy at up to 22 weeks, and observational (open-label) evidence of continued efficacy at up to 3 years. A 2015 review found a larger effect with medications than with talk therapy. Medications with benefit include serotonin-noradrenaline reuptake inhibitors, benzodiazepines, and selective serotonin reuptake inhibitors. Insomnia. Benzodiazepines can be useful for short-term treatment of insomnia. Their use beyond 2 to 4 weeks is not recommended due to the risk of dependence. The Committee on Safety of Medicines report recommended that, where long-term use of benzodiazepines for insomnia is indicated, treatment should be intermittent wherever possible. It is preferred that benzodiazepines be taken intermittently and at the lowest effective dose. They improve sleep-related problems by shortening the time spent in bed before falling asleep, prolonging the sleep time, and, in general, reducing wakefulness. However, they worsen sleep quality by increasing light sleep and decreasing deep sleep. Other drawbacks of hypnotics, including benzodiazepines, are possible tolerance to their effects, reduced slow-wave sleep, and a withdrawal period typified by rebound insomnia and prolonged anxiety and agitation. The list of benzodiazepines approved for the treatment of insomnia is fairly similar among most countries, but which benzodiazepines are officially designated as first-line hypnotics for insomnia varies between countries. Longer-acting benzodiazepines such as nitrazepam and diazepam have residual effects that may persist into the next day and are, in general, not recommended. Since the release of nonbenzodiazepines, also known as Z-drugs, in 1992 in response to safety concerns, individuals with insomnia and other sleep disorders have increasingly been prescribed nonbenzodiazepines (rising from 2.3% of Americans in 1993 to 13.7% in 2010) and less often prescribed benzodiazepines (falling from 23.5% in 1993 to 10.8% in 2010). It is not clear whether the newer nonbenzodiazepine hypnotics (Z-drugs) are better than the short-acting benzodiazepines. The efficacy of these two groups of medications is similar. According to the US Agency for Healthcare Research and Quality, indirect comparison indicates that side-effects from benzodiazepines may be about twice as frequent as from nonbenzodiazepines. Some experts suggest using nonbenzodiazepines preferentially as a first-line long-term treatment of insomnia. However, the UK National Institute for Health and Clinical Excellence did not find any convincing evidence in favor of Z-drugs. A NICE review pointed out that short-acting Z-drugs were inappropriately compared in clinical trials with long-acting benzodiazepines. There have been no trials comparing short-acting Z-drugs with appropriate doses of short-acting benzodiazepines. Based on this, NICE recommended choosing the hypnotic based on cost and the patient's preference. Older adults should not use benzodiazepines to treat insomnia unless other treatments have failed.
When benzodiazepines are used, patients, their caretakers, and their physician should discuss the increased risk of harms, including evidence that shows twice the incidence of traffic collisions among driving patients, and falls and hip fracture for older patients. Seizures. Prolonged convulsive epileptic seizures are a medical emergency that can usually be dealt with effectively by administering fast-acting benzodiazepines, which are potent anticonvulsants. In a hospital environment, intravenous clonazepam, lorazepam, and diazepam are first-line choices. In the community, intravenous administration is not practical and so rectal diazepam or buccal midazolam are used, with a preference for midazolam as its administration is easier and more socially acceptable. When benzodiazepines were first introduced, they were enthusiastically adopted for treating all forms of epilepsy. However, drowsiness and tolerance become problems with continued use, and none are now considered first-line choices for long-term epilepsy therapy. Clobazam is widely used by specialist epilepsy clinics worldwide and clonazepam is popular in the Netherlands, Belgium and France. Clobazam was approved for use in the United States in 2011. In the UK, both clobazam and clonazepam are second-line choices for treating many forms of epilepsy. Clobazam also has a useful role for very short-term seizure prophylaxis and in catamenial epilepsy. Discontinuation after long-term use in epilepsy requires additional caution because of the risks of rebound seizures. Therefore, the dose is slowly tapered over six months or longer. Alcohol withdrawal. Chlordiazepoxide is the most commonly used benzodiazepine for alcohol detoxification, but diazepam may be used as an alternative. Both are used in the detoxification of individuals who are motivated to stop drinking, and are prescribed for a short period to reduce the risks of developing tolerance and dependence to the benzodiazepine medication itself. Benzodiazepines with a longer half-life make detoxification more tolerable and make dangerous (and potentially lethal) alcohol withdrawal effects less likely to occur. On the other hand, short-acting benzodiazepines may lead to breakthrough seizures, and are, therefore, not recommended for detoxification in an outpatient setting. Oxazepam and lorazepam are often used in patients at risk of drug accumulation, in particular, the elderly and those with cirrhosis, because they are metabolized differently from other benzodiazepines, through conjugation. Benzodiazepines are the preferred choice in the management of alcohol withdrawal syndrome, in particular, for the prevention and treatment of the dangerous complication of seizures and in subduing severe delirium. Lorazepam is the only benzodiazepine with predictable intramuscular absorption and it is the most effective in preventing and controlling acute seizures. Other indications. Benzodiazepines are often prescribed for a wide range of other conditions as well. Contraindications. Benzodiazepines require special precaution if used in the elderly, during pregnancy, in children, in alcohol- or drug-dependent individuals, and in individuals with comorbid psychiatric disorders. Because of their muscle relaxant action, benzodiazepines may cause respiratory depression in susceptible individuals. For that reason, they are contraindicated in people with myasthenia gravis, sleep apnea, bronchitis, and COPD.
Caution is required when benzodiazepines are used in people with personality disorders or intellectual disability because of frequent paradoxical reactions. In major depression, they may precipitate suicidal tendencies and are sometimes used for suicidal overdoses. Individuals with a history of excessive alcohol use or non-medical use of opioids or barbiturates should avoid benzodiazepines, as there is a risk of life-threatening interactions with these drugs. Pregnancy. In the United States, the Food and Drug Administration has categorized benzodiazepines into either category D or X, meaning the potential for harm to the unborn has been demonstrated. Exposure to benzodiazepines during pregnancy has been associated with a slightly increased (from 0.06 to 0.07%) risk of cleft palate in newborns, a controversial conclusion as some studies find no association between benzodiazepines and cleft palate; the arithmetic behind this small increase is worked through below. Their use by expectant mothers shortly before delivery may result in a floppy infant syndrome. Newborns with this condition tend to have hypotonia, hypothermia, lethargy, and breathing and feeding difficulties. Cases of neonatal withdrawal syndrome have been described in infants chronically exposed to benzodiazepines in utero. This syndrome may be hard to recognize, as it starts several days after delivery, for example, as late as 21 days for chlordiazepoxide. The symptoms include tremors, hypertonia, hyperreflexia, hyperactivity, and vomiting and may last for up to three to six months. Tapering down the dose during pregnancy may lessen its severity. If used in pregnancy, those benzodiazepines with a better and longer safety record, such as diazepam or chlordiazepoxide, are recommended over potentially more harmful benzodiazepines, such as temazepam or triazolam. Using the lowest effective dose for the shortest period minimizes the risks to the unborn child. Elderly. The benefits of benzodiazepines are least, and the risks greatest, in the elderly. They are listed as a potentially inappropriate medication for older adults by the American Geriatrics Society. The elderly are at an increased risk of dependence, are more sensitive to adverse effects such as memory problems, daytime sedation, and impaired motor coordination, and face an increased risk of motor vehicle accidents, falls, and hip fractures. The long-term effects of benzodiazepines and benzodiazepine dependence in the elderly can resemble dementia, depression, or anxiety syndromes, and progressively worsen over time. Adverse effects on cognition can be mistaken for the effects of old age. The benefits of withdrawal include improved cognition, alertness, mobility, reduced risk of incontinence, and a reduced risk of falls and fractures. The success of gradually tapering benzodiazepines is as great in the elderly as in younger people. Benzodiazepines should be prescribed to the elderly only with caution and only for a short period at low doses. Short- to intermediate-acting benzodiazepines such as oxazepam and temazepam are preferred in the elderly. The high-potency benzodiazepines alprazolam and triazolam, as well as long-acting benzodiazepines, are not recommended in the elderly due to increased adverse effects. Nonbenzodiazepines such as zaleplon and zolpidem and low doses of sedating antidepressants are sometimes used as alternatives to benzodiazepines.
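To make the cleft-palate figures quoted in the pregnancy passage above concrete, here is the absolute-versus-relative risk arithmetic as a worked example. This is a standard epidemiological calculation applied to the numbers already given; it is not taken from any cited study:

$$\text{absolute risk increase} = 0.07\% - 0.06\% = 0.01\ \text{percentage points} \approx 1 \text{ additional case per } 10{,}000 \text{ exposures}$$

$$\text{relative risk} \approx \frac{0.07\%}{0.06\%} \approx 1.17$$

That is, even if the association is real, a roughly 17% relative increase still corresponds to a very small absolute change, which is part of why the conclusion remains controversial.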
Long-term use of benzodiazepines is associated with an increased risk of cognitive impairment and dementia, and a reduction in prescribing levels is likely to reduce dementia risk. The association between a history of benzodiazepine use and cognitive decline is unclear: some studies report a lower risk of cognitive decline in former users, some find no association, and some indicate an increased risk of cognitive decline. Benzodiazepines are sometimes prescribed to treat behavioral symptoms of dementia. However, like antidepressants, they have little evidence of effectiveness, although antipsychotics have shown some benefit. The cognition-impairing effects of benzodiazepines, which occur frequently in the elderly, can also worsen dementia. Adverse effects. The most common side effects of benzodiazepines are related to their sedating and muscle-relaxing action. They include drowsiness, dizziness, and decreased alertness and concentration. Lack of coordination may result in falls and injuries, particularly in the elderly. Another result is impairment of driving skills and an increased likelihood of road traffic accidents. Decreased libido and erection problems are common side effects. Depression and disinhibition may emerge. Hypotension and suppressed breathing (hypoventilation) may be encountered with intravenous use. Less common side effects include nausea and changes in appetite, blurred vision, confusion, euphoria, depersonalization and nightmares. Cases of liver toxicity have been described but are very rare. The long-term effects of benzodiazepine use can include cognitive impairment as well as affective and behavioural problems. Feelings of turmoil, difficulty in thinking constructively, loss of sex drive, agoraphobia and social phobia, increasing anxiety and depression, loss of interest in leisure pursuits and interests, and an inability to experience or express feelings can also occur. Not everyone, however, experiences problems with long-term use. Additionally, an altered perception of self, environment and relationships may occur. A study published in 2020 found that long-term use of prescription benzodiazepines is associated with an increase in all-cause mortality among those age 65 or younger, but not among those older than 65. The study also found that all-cause mortality was increased further in cases in which benzodiazepines were co-prescribed with opioids, relative to cases in which benzodiazepines were prescribed without opioids, but again only in those age 65 or younger. Compared to other sedative-hypnotics, hospital visits involving benzodiazepines had 66% greater odds of a serious adverse health outcome (hospitalization, patient transfer, or death), and visits involving a combination of benzodiazepines and non-benzodiazepine receptor agonists had almost four times the odds of a serious health outcome. In September 2020, the US Food and Drug Administration (FDA) required the boxed warning to be updated for all benzodiazepine medicines to describe the risks of abuse, misuse, addiction, physical dependence, and withdrawal reactions consistently across all the medicines in the class. Cognitive effects. The short-term use of benzodiazepines adversely affects multiple areas of cognition, the most notable being that it interferes with the formation and consolidation of memories of new material and may induce complete anterograde amnesia. However, researchers hold contrary opinions regarding the effects of long-term administration.
One view is that many of the short-term effects continue into the long term and may even worsen, and are not resolved after benzodiazepine use stops. Another view maintains that cognitive deficits in chronic benzodiazepine users occur only for a short period after the dose, or that the anxiety disorder is the cause of these deficits. While definitive studies are lacking, the former view received support from a 2004 meta-analysis of 13 small studies. This meta-analysis found that long-term use of benzodiazepines was associated with moderate to large adverse effects on all areas of cognition, with visuospatial memory being the most commonly detected impairment. Some of the other impairments reported were decreased IQ, visuomotor coordination, information processing, verbal learning, and concentration. The authors of the meta-analysis and a later reviewer noted that the applicability of this meta-analysis is limited because the subjects were taken mostly from withdrawal clinics; coexisting drug use, alcohol use, and psychiatric disorders were not defined; and several of the included studies conducted the cognitive measurements during the withdrawal period. Paradoxical effects. Paradoxical reactions, such as increased seizures in epileptics, aggression, violence, impulsivity, irritability, and suicidal behavior, sometimes occur. These reactions have been explained as consequences of disinhibition and the subsequent loss of control over socially unacceptable behavior. Paradoxical reactions are rare in the general population, with an incidence rate below 1% and similar to placebo. However, they occur with greater frequency in recreational abusers, individuals with borderline personality disorder, children, and patients on high-dosage regimens. In these groups, impulse control problems are perhaps the most important risk factor for disinhibition; learning disabilities and neurological disorders are also significant risks. Most reports of disinhibition involve high doses of high-potency benzodiazepines. Paradoxical effects may also appear after chronic use of benzodiazepines. Long-term worsening of psychiatric symptoms. While benzodiazepines may have short-term benefits for anxiety, sleep, and agitation in some patients, long-term (i.e., greater than 2–4 weeks) use can result in a worsening of the very symptoms the medications are meant to treat. Potential explanations include exacerbating cognitive problems that are already common in anxiety disorders, causing or worsening depression and suicidality, disrupting sleep architecture by inhibiting deep-stage sleep, withdrawal or rebound symptoms between doses mimicking or exacerbating underlying anxiety or sleep disorders, inhibiting the benefits of psychotherapy by inhibiting memory consolidation and reducing fear extinction, and reducing coping with trauma/stress and increasing vulnerability to future stress. The latter two explanations may be why benzodiazepines are ineffective and/or potentially harmful in PTSD and phobias. Anxiety, insomnia and irritability may be temporarily exacerbated during withdrawal, but psychiatric symptoms after discontinuation are usually milder than those experienced while taking benzodiazepines. Functioning significantly improves within 1 year of discontinuation. Physical dependence, withdrawal, and post-withdrawal syndromes. Tolerance.
Tolerance and dependence are risks of chronic benzodiazepine use, and can result in doses within the therapeutic range ceasing to offer meaningful symptomatic relief after prolonged use. Tolerance develops at different rates and to different degrees for the sedative, hypnotic, anticonvulsant, muscle relaxant, and anxiolytic effects of benzodiazepines. A review of benzodiazepine tolerance concluded that it "appears that tolerance develops relatively quickly for the sedative and anticonvulsant actions of benzodiazepines, whereas tolerance to anxiolytic and amnesic effects probably does not develop at all", although the included randomized controlled trial evidence is limited to 22 weeks. A review of clonazepam in the treatment of psychiatric disorders concluded that longitudinal data support anxiolytic benefit without tolerance during long-term use, including an open-label study finding continued benefit at 3 years; however, the review concludes that long-term RCT evidence is scant. A study of benzodiazepine sensitivity found that patients treated chronically with alprazolam did not differ from untreated patients in terms of anxiolytic response to diazepam, suggesting a lack of anxiolytic tolerance. Controversy nevertheless remains regarding tolerance to anxiolytic effects, with some publications reporting that there is little evidence of continued efficacy beyond 4–6 months or that dependence phenomena are common. However, some of these references were published before the in-depth scoping reviews of benzodiazepine tolerance, and they lack citations of RCT evidence of tolerance. Studies reporting on voluntary benzodiazepine cessation and withdrawal include patient reports of tolerance and worsening anxiety. The question of tolerance to the amnesic effects of benzodiazepines is, likewise, unclear. Some evidence suggests that partial tolerance does develop, and that "memory impairment is limited to a narrow window within 90 minutes after each dose". A major disadvantage of benzodiazepines is that tolerance to therapeutic effects develops relatively quickly, while many adverse effects persist. Tolerance develops to hypnotic and myorelaxant effects within days to weeks, and to anticonvulsant effects within weeks to months. Therefore, benzodiazepines are unlikely to be effective long-term treatments for sleep. While BZD therapeutic effects may disappear with tolerance, depression and impulsivity with high suicidal risk commonly persist. Several studies have confirmed that long-term benzodiazepines are not significantly different from placebo for sleep, and question their use for anxiety disorders such as PTSD and OCD. This may explain why patients commonly increase doses over time and why many eventually take more than one type of benzodiazepine after the first loses effectiveness. Additionally, because tolerance to benzodiazepines' sedating effects develops more quickly than does tolerance to their brainstem depressant effects, those taking more benzodiazepines to achieve desired effects may experience sudden respiratory depression, hypotension or death. Most patients with anxiety disorders and PTSD have symptoms that persist for at least several months, making tolerance to therapeutic effects a distinct problem for them and underscoring the need for more effective long-term treatment (e.g., psychotherapy, serotonergic antidepressants). Withdrawal symptoms and management.
Discontinuation of benzodiazepines or abrupt reduction of the dose, even after a relatively short course of treatment (two to four weeks), may result in two groups of symptoms: rebound and withdrawal. Rebound symptoms are the return of the symptoms for which the patient was treated, but worse than before. Withdrawal symptoms are the new symptoms that occur when the benzodiazepine is stopped. They are the main sign of physical dependence. The American Society of Addiction Medicine (ASAM), in collaboration with ten other American medical associations, issued a joint statement regarding benzodiazepine tapering in June 2025. They recommend that, for most patients dependent on benzodiazepines, the initial pace of the taper should generally include dose reductions of 5 to 10% every 2–4 weeks. The taper should not exceed 25% every 2 weeks. They emphasized that a taper that is too rapid can be dangerous and potentially life-threatening. The most frequent symptoms of withdrawal from benzodiazepines are insomnia, gastric problems, tremors, agitation, fearfulness, and muscle spasms. The less frequent effects are irritability, sweating, depersonalization, derealization, hypersensitivity to stimuli, depression, suicidal behavior, psychosis, seizures, and delirium tremens. Severe symptoms usually occur as a result of abrupt or over-rapid withdrawal. Abrupt withdrawal can be dangerous and lead to excitotoxicity, causing damage and even death to nerve cells as a result of excessive levels of the excitatory neurotransmitter glutamate. Increased glutamatergic activity is thought to be part of a compensatory mechanism to chronic GABAergic inhibition from benzodiazepines. Therefore, a gradual reduction regimen is recommended. Symptoms may also occur during a gradual dosage reduction, but are typically less severe and may persist as part of a protracted withdrawal syndrome for months after cessation of benzodiazepines. Approximately 10% of patients experience a notable protracted withdrawal syndrome, which can persist for many months or in some cases a year or longer. Protracted symptoms tend to resemble those seen during the first couple of months of withdrawal, but usually are of a sub-acute level of severity. Such symptoms do gradually lessen over time, eventually disappearing altogether. Benzodiazepines have a reputation with patients and doctors for causing a severe and traumatic withdrawal; however, this is in large part due to the withdrawal process being poorly managed. Over-rapid withdrawal from benzodiazepines increases the severity of the withdrawal syndrome and increases the failure rate. A slow and gradual withdrawal customised to the individual and, if indicated, psychological support is the most effective way of managing the withdrawal. Opinion as to the time needed to complete withdrawal ranges from four weeks to several years. A goal of less than six months has been suggested, but due to factors such as dosage and type of benzodiazepine, reasons for prescription, lifestyle, personality, environmental stresses, and amount of available support, a year or more may be needed to withdraw. Withdrawal is best managed by transferring the physically dependent patient to an equivalent dose of diazepam because it has the longest half-life of all of the benzodiazepines, is metabolised into long-acting active metabolites, and is available in low-potency tablets, which can be quartered for smaller doses. A further benefit is that it is available in liquid form, which allows for even smaller reductions. 
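As a purely illustrative aid to the tapering arithmetic discussed above (the ASAM-style pacing of roughly 5–10% reductions every 2–4 weeks, never more than 25% per two weeks), here is a minimal sketch in Python. It is not medical guidance; the function name, the default 10% step, and the 0.5 mg stopping floor are assumptions invented for the example rather than values taken from the joint statement.

```python
# Illustrative only, not medical guidance. Computes a hypothetical
# fixed-percentage taper schedule in the spirit of the pacing described
# above; all names and default values here are assumptions.

def taper_schedule(start_dose_mg: float, reduction: float = 0.10,
                   interval_weeks: int = 2, floor_mg: float = 0.5):
    """Yield (week, dose_mg) pairs, cutting the dose by a fixed
    fraction each interval until it falls below a practical floor."""
    # The joint statement caps reductions at 25% per two weeks.
    assert reduction <= 0.25, "reduction exceeds 25% per step"
    week, dose = 0, start_dose_mg
    while dose >= floor_mg:
        yield week, round(dose, 2)
        dose *= 1.0 - reduction  # e.g. 10% smaller each step
        week += interval_weeks

# Example: a 20 mg starting dose reduced by 10% every 2 weeks.
for week, dose in taper_schedule(20.0):
    print(f"week {week:3d}: {dose} mg")
```

Note that a fixed-percentage schedule yields geometric (ever-smaller) absolute reductions, which is one reason the low-potency tablets and liquid formulations of diazepam mentioned above become useful near the end of a taper.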
Chlordiazepoxide, which also has a long half-life and long-acting active metabolites, can be used as an alternative. Nonbenzodiazepines are contraindicated during benzodiazepine withdrawal as they are cross-tolerant with benzodiazepines and can induce dependence. Alcohol is also cross-tolerant with benzodiazepines and more toxic, and thus caution is needed to avoid replacing one dependence with another. During withdrawal, fluoroquinolone-based antibiotics are best avoided if possible; they displace benzodiazepines from their binding site and reduce GABA function and, thus, may aggravate withdrawal symptoms. Antipsychotics, especially clozapine, olanzapine, and low-potency phenothiazines such as chlorpromazine, are not recommended for benzodiazepine withdrawal (or other CNS depressant withdrawal states), as they lower the seizure threshold and can worsen withdrawal effects; if they are used, extreme caution is required. Withdrawal from long-term benzodiazepines is beneficial for most individuals. Withdrawal of benzodiazepines from long-term users, in general, leads to improved physical and mental health, particularly in the elderly; although some long-term users report continued benefit from taking benzodiazepines, this may be the result of suppression of withdrawal effects. Controversial associations. Beyond the well-established link between benzodiazepines and psychomotor impairment resulting in motor vehicle accidents and falls leading to fracture, research in the 2000s and 2010s has raised associations between benzodiazepines (and Z-drugs) and other, as yet unproven, adverse effects, including dementia, cancer, infections, pancreatitis and respiratory disease exacerbations. Dementia. Several studies have drawn an association between long-term benzodiazepine use and neurodegenerative disease, particularly Alzheimer's disease. It has been determined that long-term use of benzodiazepines is associated with increased dementia risk, even after controlling for protopathic bias. In contrast, a clinical study using florbetapir PET-CT and MRI to compute beta-amyloid load and hippocampal volume, respectively, found benzodiazepines to be associated with decreased beta-amyloid load and reduced hippocampal atrophy in a cohort of older nondemented adults with isolated or mild cognitive impairments. Unlike previous imaging studies limited to MRI techniques, the utilization of a dual PET-CT scanner to track the binding of the radioligand florbetapir to beta-amyloid provides stronger neuroimaging data with respect to Alzheimer's-related pathophysiology. Indeed, Avid's florbetapir PET technique has received FDA approval for diagnosing Alzheimer's disease. However, it is important to note that there is still no causal evidence regarding how benzodiazepines affect the risk of Alzheimer's disease or dementia. Infections. Some observational studies have detected significant associations between benzodiazepines and respiratory infections such as pneumonia, while others have not. A large meta-analysis of pre-marketing randomized controlled trials on the pharmacologically related Z-drugs suggests a small increase in infection risk as well. An immunodeficiency effect from the action of benzodiazepines on GABAA receptors has been postulated from animal studies. Cancer. A meta-analysis of observational studies has determined an association between benzodiazepine use and cancer, though the risk across different agents and different cancers varied significantly.
In terms of experimental basic-science evidence, an analysis of carcinogenicity and genotoxicity data for various benzodiazepines has suggested a small possibility of carcinogenesis for a small number of benzodiazepines. Pancreatitis. The evidence suggesting a link between benzodiazepines (and Z-drugs) and pancreatic inflammation is very sparse and limited to a few observational studies from Taiwan. As with the other controversial associations above, a criticism of confounding can be applied to these findings. Further well-designed research from other populations, as well as a biologically plausible mechanism, is required to confirm this association. Overdose. Although benzodiazepines are much safer in overdose than their predecessors, the barbiturates, they can still cause problems in overdose. Taken alone, they rarely cause severe complications in overdose; statistics in England showed that benzodiazepines were responsible for 3.8% of all deaths by poisoning from a single drug. However, combining these drugs with alcohol, opiates or tricyclic antidepressants markedly raises the toxicity. The elderly are more sensitive to the side effects of benzodiazepines, and poisoning may even occur from their long-term use. The various benzodiazepines differ in their toxicity; temazepam appears most toxic in overdose and when used with other drugs. The symptoms of a benzodiazepine overdose may include drowsiness, slurred speech, nystagmus, hypotension, ataxia, coma, respiratory depression, and cardiorespiratory arrest. A reversal agent for benzodiazepines exists: flumazenil (Anexate), itself a member of the benzodiazepine chemical class. Its use as an antidote is not routinely recommended because of the high risk of resedation and seizures. In a double-blind, placebo-controlled trial of 326 people, 4 people had serious adverse events and 61% became resedated following the use of flumazenil. Numerous contraindications to its use exist. It is contraindicated in people with a history of long-term use of benzodiazepines, those having ingested a substance that lowers the seizure threshold or may cause an arrhythmia, and those with abnormal vital signs. One study found that only 10% of the people presenting with a benzodiazepine overdose are suitable candidates for treatment with flumazenil. Interactions. Individual benzodiazepines may have different interactions with certain drugs. Depending on their metabolic pathway, benzodiazepines can be divided roughly into two groups. The largest group consists of those that are metabolized by cytochrome P450 (CYP450) enzymes and possess significant potential for interactions with other drugs. The other group comprises those that are metabolized through glucuronidation, such as lorazepam, oxazepam, and temazepam, and, in general, have few drug interactions. Many drugs, including oral contraceptives, some antibiotics, antidepressants, and antifungal agents, inhibit cytochrome enzymes in the liver. They reduce the rate of elimination of the benzodiazepines that are metabolized by CYP450, leading to possibly excessive drug accumulation and increased side effects. In contrast, drugs that induce cytochrome P450 enzymes, such as St John's wort, the antibiotic rifampicin, and the anticonvulsants carbamazepine and phenytoin, accelerate the elimination of many benzodiazepines and decrease their action. Taking benzodiazepines with alcohol, opioids and other central nervous system depressants potentiates their action.
This often results in increased sedation, impaired motor coordination, suppressed breathing, and other adverse effects that have the potential to be lethal. Antacids can slow down the absorption of some benzodiazepines; however, this effect is marginal and inconsistent. Pharmacology. Pharmacodynamics. Benzodiazepines work by increasing the effectiveness of the endogenous neurotransmitter GABA in decreasing the excitability of neurons. This reduces the communication between neurons and, therefore, has a calming effect on many of the functions of the brain. GABA controls the excitability of neurons by binding to the GABAA receptor. The GABAA receptor is a protein complex located in the synapses between neurons. All GABAA receptors contain an ion channel that conducts chloride ions across neuronal cell membranes and two binding sites for the neurotransmitter gamma-aminobutyric acid (GABA), while a subset of GABAA receptor complexes also contain a single binding site for benzodiazepines. Binding of benzodiazepines to this receptor complex does not alter the binding of GABA. Unlike other positive allosteric modulators that increase ligand binding, benzodiazepines act as positive allosteric modulators by increasing the total conduction of chloride ions across the neuronal cell membrane when GABA is already bound to its receptor. This increased chloride ion influx hyperpolarizes the neuron's membrane potential. As a result, the difference between resting potential and threshold potential is increased, and firing is less likely. Different GABAA receptor subtypes have varying distributions within different regions of the brain and, therefore, control distinct neuronal circuits. Hence, activation of different GABAA receptor subtypes by benzodiazepines may result in distinct pharmacological actions. In terms of the mechanism of action of benzodiazepines, their similarities are too great to separate them into individual categories such as anxiolytic or hypnotic. For example, a hypnotic administered in low doses produces anxiety-relieving effects, whereas a benzodiazepine marketed as an anti-anxiety drug at higher doses induces sleep. The subset of GABAA receptors that also bind benzodiazepines are referred to as benzodiazepine receptors (BzR). The GABAA receptor is a heteromer composed of five subunits, the most common ones being two "α"s, two "β"s, and one "γ" (α2β2γ1). For each subunit, many subtypes exist (α1–6, β1–3, and γ1–3). GABAA receptors made up of different combinations of subunit subtypes have different properties, different distributions in the brain, and different activities relative to pharmacological and clinical effects. Benzodiazepines bind at the interface of the α and γ subunits on the GABAA receptor. Binding also requires that the alpha subunit contain a histidine amino acid residue (i.e., α1-, α2-, α3-, and α5-containing GABAA receptors). For this reason, benzodiazepines show no affinity for GABAA receptors containing α4 and α6 subunits, which have an arginine instead of a histidine residue. Once bound to the benzodiazepine receptor, the benzodiazepine ligand locks the benzodiazepine receptor into a conformation in which it has a greater affinity for the GABA neurotransmitter. This increases the frequency of the opening of the associated chloride ion channel and hyperpolarizes the membrane of the associated neuron. The inhibitory effect of the available GABA is potentiated, leading to sedative and anxiolytic effects.
For instance, ligands with high activity at the α1 subunit are associated with stronger hypnotic effects, whereas those with higher affinity for GABAA receptors containing α2 and/or α3 subunits have good anti-anxiety activity. GABAA receptors participate in the regulation of synaptic pruning by prompting microglial spine engulfment. Benzodiazepines have been shown to upregulate microglial spine engulfment and prompt overzealous eradication of synaptic connections. This mechanism may help explain the increased risk of dementia associated with long-term benzodiazepine treatment. The benzodiazepine class of drugs also interacts with peripheral benzodiazepine receptors. Peripheral benzodiazepine receptors are present in peripheral nervous system tissues, glial cells, and, to a lesser extent, the central nervous system. These peripheral receptors are not structurally related or coupled to GABAA receptors. They modulate the immune system and are involved in the body's response to injury. Benzodiazepines also function as weak adenosine reuptake inhibitors. It has been suggested that some of their anticonvulsant, anxiolytic, and muscle relaxant effects may be in part mediated by this action. Benzodiazepines have binding sites in the periphery; however, their effects on muscle tone are not mediated through these peripheral receptors. The peripheral binding sites for benzodiazepines are present in immune cells and the gastrointestinal tract. Pharmacokinetics. A benzodiazepine can be placed into one of three groups by its elimination half-life, or the time it takes for the body to eliminate half of the dose. Some benzodiazepines, such as diazepam and chlordiazepoxide, have long-acting active metabolites: both are metabolised into desmethyldiazepam, which has a half-life of 36–200 hours, while flurazepam's main active metabolite, desalkylflurazepam, has a half-life of 40–250 hours. These long-acting metabolites are partial agonists. Chemistry. Benzodiazepines share a similar chemical structure, and their effects in humans are mainly produced by the allosteric modification of a specific kind of neurotransmitter receptor, the GABAA receptor, which increases the overall conductance of these inhibitory channels; this results in the various therapeutic effects as well as adverse effects of benzodiazepines. Other less important modes of action are also known. The term "benzodiazepine" is the chemical name for the heterocyclic ring system (see figure to the right), which is a fusion between the benzene and diazepine ring systems. Under Hantzsch–Widman nomenclature, a diazepine is a heterocycle with two nitrogen atoms, five carbon atoms and the maximum number of noncumulative double bonds. The "benzo" prefix indicates the benzene ring fused onto the diazepine ring. Benzodiazepine drugs are substituted 1,4-benzodiazepines, although the chemical term can refer to many other compounds that do not have useful pharmacological properties. Different benzodiazepine drugs have different side groups attached to this central structure. The different side groups affect the binding of the molecule to the GABAA receptor and so modulate the pharmacological properties. Many of the pharmacologically active "classical" benzodiazepine drugs contain the 5-phenyl-1"H"-benzo["e"] [1,4]diazepin-2(3"H")-one substructure (see figure to the right). Benzodiazepines have been found to mimic protein reverse turns structurally, which enables them to exert their biological activity in many cases.
Nonbenzodiazepines also bind to the benzodiazepine binding site on the GABAA receptor and possess similar pharmacological properties. While the nonbenzodiazepines are by definition structurally unrelated to the benzodiazepines, both classes of drugs possess a common pharmacophore (see figure to the lower-right), which explains their binding to a common receptor site. Not all benzodiazepines increase the conductance of the GABAA receptor. Flumazenil, an imidazobenzodiazepine, is an antidote for some benzodiazepine overdoses. The structural scaffold can even be used to target receptors other than GABAA. Benzodiazepines are commonly grouped by chemical substitution pattern: 2-keto compounds (clorazepate, diazepam, flurazepam, halazepam, prazepam, and others); 3-hydroxy compounds (lorazepam, lormetazepam, oxazepam, temazepam); 7-nitro compounds (clonazepam, flunitrazepam, nimetazepam, nitrazepam); triazolo compounds (adinazolam, alprazolam, estazolam, triazolam); imidazo compounds (climazolam, loprazolam, midazolam); and 1,5-benzodiazepines (clobazam). History. The first benzodiazepine, chlordiazepoxide ("Librium"), was synthesized in 1955 by Leo Sternbach while working at Hoffmann–La Roche on the development of tranquilizers. The pharmacological properties of the compounds prepared initially were disappointing, and Sternbach abandoned the project. Two years later, in April 1957, co-worker Earl Reeder noticed a "nicely crystalline" compound left over from the discontinued project while spring-cleaning in the lab. This compound, later named chlordiazepoxide, had not been tested in 1955 because of Sternbach's focus on other issues. Expecting pharmacology results to be negative and hoping to publish the chemistry-related findings, researchers submitted it for a standard battery of animal tests. The compound showed very strong sedative, anticonvulsant, and muscle relaxant effects. These impressive findings led to its speedy introduction throughout the world in 1960 under the brand name "Librium". Following chlordiazepoxide, diazepam was marketed by Hoffmann–La Roche under the brand name "Valium" in 1963, and for a while the two were the most commercially successful drugs. The introduction of benzodiazepines led to a decrease in the prescription of barbiturates, and by the 1970s, they had largely replaced the older drugs for sedative and hypnotic uses. The new group of drugs was initially greeted with optimism by the medical profession, but gradually, concerns arose; in particular, the risk of dependence became evident in the 1980s. Benzodiazepines have a unique history in that they were responsible for the largest-ever class-action lawsuit against drug manufacturers in the United Kingdom, involving 14,000 patients and 1,800 law firms that alleged the manufacturers knew of the dependence potential but intentionally withheld this information from doctors. At the same time, 117 general practitioners and 50 health authorities were sued by patients to recover damages for the harmful effects of dependence and withdrawal. This led some doctors to require a signed consent form from their patients and to recommend that all patients be adequately warned of the risks of dependence and withdrawal before starting treatment with benzodiazepines. The court case against the drug manufacturers never reached a verdict; legal aid had been withdrawn, and there were allegations that the consultant psychiatrists, the expert witnesses, had a conflict of interest. The case fell through at a cost of £30 million and led to more cautious legal-aid funding of future cases.
This made future class-action lawsuits less likely to succeed, owing to the high cost of financing a smaller number of cases and the increased charges each person involved would face if the case were lost. Although antidepressants with anxiolytic properties have been introduced, and there is increasing awareness of the adverse effects of benzodiazepines, prescriptions for short-term anxiety relief have not significantly dropped. For treatment of insomnia, benzodiazepines are now less popular than nonbenzodiazepines, which include zolpidem, zaleplon and eszopiclone. Nonbenzodiazepines are molecularly distinct but nonetheless work on the same benzodiazepine receptors and produce similar sedative effects. Benzodiazepines have been detected in plant specimens and brain samples of animals not exposed to synthetic sources, including a human brain from the 1940s. However, it is unclear whether these compounds are biosynthesized by microbes or by plants and animals themselves. A microbial biosynthetic pathway has been proposed. Society and culture. Legal status. In the United States, benzodiazepines are Schedule IV drugs under the Federal Controlled Substances Act, even when not on the market (for example, flunitrazepam), with the exception of flualprazolam, etizolam, clonazolam, flubromazolam, and diclazepam, which are placed in Schedule I. In Canada, possession of benzodiazepines is legal for personal use. All benzodiazepines are categorized as Schedule IV substances under the Controlled Drugs and Substances Act. In the United Kingdom, benzodiazepines are Class C controlled drugs, carrying a maximum penalty of 7 years imprisonment, an unlimited fine, or both for possession, and a maximum penalty of 14 years imprisonment, an unlimited fine, or both for supplying benzodiazepines to others. In the Netherlands, since October 1993, benzodiazepines, including formulations containing less than 20 mg of temazepam, are all placed on List 2 of the Opium Law. A prescription is needed for possession of all benzodiazepines. Temazepam formulations containing 20 mg or greater of the drug are placed on List 1, thus requiring doctors to write prescriptions in the List 1 format. In East Asia and Southeast Asia, temazepam and nimetazepam are often heavily controlled and restricted. In certain countries, triazolam, flunitrazepam, flutoprazepam and midazolam are also restricted or controlled to certain degrees. In Hong Kong, all benzodiazepines are regulated under Schedule 1 of Hong Kong's Chapter 134 "Dangerous Drugs Ordinance". Previously only brotizolam, flunitrazepam and triazolam were classed as dangerous drugs. Internationally, benzodiazepines are categorized by the INCB as Schedule IV controlled drugs, apart from flunitrazepam, which is a Schedule III drug under the Convention on Psychotropic Substances. Recreational use. Benzodiazepines are considered major addictive substances. Non-medical benzodiazepine use is mostly limited to individuals who use other substances, i.e., people who engage in polysubstance use. Some variation in drug scheduling exists in individual countries; for example, in the United Kingdom, midazolam and temazepam are Schedule III controlled drugs. British law requires that temazepam (but "not" midazolam) be stored in safe custody.
Safe custody requirements ensure that pharmacists and doctors holding stock of temazepam must store it in securely fixed, double-locked steel safety cabinets and maintain a written register, which must be bound, contain separate entries for temazepam, and be written in ink with no use of correction fluid (although in practice a written register is not required for temazepam in the United Kingdom). Disposal of expired stock must be witnessed by a designated inspector (either a local drug-enforcement police officer or an official from the health authority). Benzodiazepine use ranges from occasional binges on large doses to chronic and compulsive use of high doses. Benzodiazepines are commonly used recreationally by people who engage in polysubstance use. Mortality is higher among people with polysubstance use who also use benzodiazepines. Heavy alcohol use also increases mortality among people who engage in polysubstance use. Polydrug use involving benzodiazepines and alcohol can result in an increased risk of blackouts, risk-taking behaviours, seizures, and overdose. Dependence on and tolerance to benzodiazepines, often coupled with dosage escalation, can develop rapidly among people who misuse drugs; withdrawal syndrome may appear after as little as three weeks of continuous use. Long-term use has the potential to cause both physical and psychological dependence and severe withdrawal symptoms such as depression, anxiety (often to the point of panic attacks), and agoraphobia. Benzodiazepines and, in particular, temazepam are sometimes used intravenously, which, if done incorrectly or in an unsterile manner, can lead to medical complications including abscesses, cellulitis, thrombophlebitis, arterial puncture, deep vein thrombosis, and gangrene. Sharing syringes and needles for this purpose also raises the possibility of transmission of hepatitis, HIV, and other diseases. Benzodiazepines are also misused intranasally, which may have additional health consequences. Once benzodiazepine dependence has been established, a clinician usually converts the patient to an equivalent dose of diazepam before beginning a gradual reduction program. A 1999–2005 Australian police survey of detainees reported preliminary findings that self-reported users of benzodiazepines were less likely than non-user detainees to work full-time and more likely to receive government benefits, use methamphetamine or heroin, and be arrested or imprisoned. Benzodiazepines are sometimes used for criminal purposes; they serve to incapacitate a victim in cases of drug-assisted rape or robbery. Overall, anecdotal evidence suggests that temazepam may be the most psychologically habit-forming (addictive) benzodiazepine. Non-medical temazepam use reached epidemic proportions in some parts of the world, in particular in Europe and Australia, and temazepam is a major addictive substance in many Southeast Asian countries. This led authorities of various countries to place temazepam under a more restrictive legal status. Some countries, such as Sweden, banned the drug outright. Temazepam also has certain pharmacokinetic properties of absorption, distribution, elimination, and clearance that make it more prone to non-medical use than many other benzodiazepines. Veterinary use. Benzodiazepines are used in veterinary practice in the treatment of various disorders and conditions. As in humans, they are used in the first-line management of seizures, status epilepticus, and tetanus, and as maintenance therapy in epilepsy (in particular, in cats).
They are widely used in small and large animals (including horses, swine, cattle and exotic and wild animals) for their anxiolytic and sedative effects, as pre-medication before surgery, for induction of anesthesia and as adjuncts to anesthesia.
4788
27823944
https://en.wikipedia.org/wiki?curid=4788
Body mass index
Body mass index (BMI) is a value derived from the mass (weight) and height of a person. The BMI is defined as the body mass divided by the square of the body height, and is expressed in units of kg/m2, resulting from mass in kilograms (kg) and height in metres (m). The BMI may be determined by first measuring its components by means of a weighing scale and a stadiometer. The multiplication and division may be carried out directly, by hand or using a calculator, or indirectly using a lookup table (or chart). The table displays BMI as a function of mass and height and may show other units of measurement (converted to metric units for the calculation). The table may also show contour lines or colours for different BMI categories. The BMI is a convenient rule of thumb used to broadly categorize a person as underweight, normal weight, overweight, or obese based on tissue mass (muscle, fat, and bone) and height. Major adult BMI classifications are "underweight" (under 18.5 kg/m2), "normal weight" (18.5 to 24.9), "overweight" (25 to 29.9), and "obese" (30 or more). When used to predict an individual's health, rather than as a statistical measurement for groups, the BMI has limitations that can make it less useful than some of the alternatives, especially when applied to individuals with abdominal obesity, short stature, or high muscle mass. BMIs under 20 and over 25 have been associated with higher all-cause mortality, with the risk increasing with distance from the 20–25 range. History. Adolphe Quetelet, a Belgian astronomer, mathematician, statistician, and sociologist, devised the basis of the BMI between 1830 and 1850 as he developed what he called "social physics". Quetelet himself never intended for the index, then called the Quetelet Index, to be used as a means of medical assessment. Instead, it was a component of his study of "l'homme moyen", or the average man. Quetelet thought of the average man as a social ideal, and developed the body mass index as a means of discovering the socially ideal human person. According to Lars Grue and Arvid Heiberg in the Scandinavian Journal of Disability Research, Quetelet's idealization of the average man would be elaborated upon by Francis Galton a decade later in the development of eugenics. The modern term "body mass index" (BMI) for the ratio of human body weight to squared height was coined in a paper published in the July 1972 edition of the "Journal of Chronic Diseases" by Ancel Keys and others. In this paper, Keys argued that what he termed the BMI was "if not fully satisfactory, at least as good as any other relative weight index as an indicator of relative obesity". The interest in an index that measures body fat came with observed increasing obesity in prosperous Western societies. Keys explicitly judged BMI as appropriate for "population" studies and inappropriate for individual evaluation. Nevertheless, due to its simplicity, it has come to be widely used for preliminary diagnoses. Additional metrics, such as waist circumference, can be more useful. The BMI is expressed in kg/m2, resulting from mass in kilograms and height in metres. If pounds and inches are used, a conversion factor of 703 (kg/m2)/(lb/in2) is applied. (If pounds and feet are used, a conversion factor of 4.88 is used.) When the term BMI is used informally, the units are usually omitted.

$$\mathrm{BMI} = \frac{\text{mass}_{\text{kg}}}{\left(\text{height}_{\text{m}}\right)^{2}} = 703 \times \frac{\text{mass}_{\text{lb}}}{\left(\text{height}_{\text{in}}\right)^{2}}$$

BMI provides a simple numeric measure of a person's "thickness" or "thinness", allowing health professionals to discuss weight problems more objectively with their patients.
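Because the calculation is often done by hand or with a lookup table, a short worked example may help. The following is a minimal Python sketch of the definition and the unit-conversion factor just described, together with the major adult classifications; the function names and sample values are illustrative, not part of any standard library.

```python
def bmi_metric(mass_kg: float, height_m: float) -> float:
    """BMI in kg/m^2 from mass in kilograms and height in metres."""
    return mass_kg / height_m ** 2

def bmi_imperial(mass_lb: float, height_in: float) -> float:
    """BMI from pounds and inches, using the 703 (kg/m^2)/(lb/in^2) factor."""
    return 703 * mass_lb / height_in ** 2

def classify_adult(bmi: float) -> str:
    """Major adult BMI classifications described above."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    return "obese"

# A 70 kg, 1.75 m adult: BMI of about 22.9, i.e. "normal weight".
b = bmi_metric(70, 1.75)
print(round(b, 1), classify_adult(b))
# The same person in imperial units (about 154 lb, 69 in) gives ~22.7;
# the small difference comes from rounding in the unit conversion.
print(round(bmi_imperial(154, 69), 1))
```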
BMI was designed to be used as a simple means of classifying average sedentary (physically inactive) populations with an average body composition. For such individuals, the BMI value recommendations are as follows: 18.5 to 24.9 kg/m2 may indicate optimal weight, lower than 18.5 may indicate underweight, 25 to 29.9 may indicate overweight, and 30 or more may indicate obesity. Lean male athletes often have a high muscle-to-fat ratio and therefore a BMI that is misleadingly high relative to their body-fat percentage. Categories. A common use of the BMI is to assess how far an individual's body weight departs from what is normal for a person's height. The weight excess or deficiency may, in part, be accounted for by body fat (adipose tissue), although other factors such as muscularity also affect BMI significantly (see discussion below and overweight). The WHO regards an adult BMI of less than 18.5 as underweight and possibly indicative of malnutrition, an eating disorder, or other health problems, while a BMI of 25 or more is considered overweight and 30 or more is considered obese. In addition to the principal international WHO BMI cut-off points (16, 17, 18.5, 25, 30, 35 and 40), four additional cut-off points for at-risk Asians were identified (23, 27.5, 32.5 and 37.5). These ranges of BMI values are valid only as statistical categories. Children and youth. BMI is used differently for people aged 2 to 20. It is calculated in the same way as for adults but then compared to typical values for other children or youth of the same age. Instead of comparison against fixed thresholds for underweight and overweight, the BMI is compared against the percentiles for children of the same sex and age. A BMI that is less than the 5th percentile is considered underweight and above the 95th percentile is considered obese. Children with a BMI between the 85th and 95th percentiles are considered to be overweight. Studies in Britain from 2013 have indicated that females between the ages of 12 and 16 had a higher BMI than males of the same age by 1.0 kg/m2 on average. International variations. These recommended distinctions along the linear scale may vary from time to time and country to country, making global, longitudinal surveys problematic. People from different populations and descent have different associations between BMI, percentage of body fat, and health risks, with a higher risk of type 2 diabetes mellitus and atherosclerotic cardiovascular disease at BMIs lower than the WHO cut-off point for overweight, 25 kg/m2, although the cut-off for observed risk varies among different populations. The cut-off for observed risk varies based on populations and subpopulations in Europe, Asia and Africa. Hong Kong. The Hospital Authority of Hong Kong recommends the use of the following BMI ranges: Japan. A 2000 study from the Japan Society for the Study of Obesity (JASSO) presents the following table of BMI categories: Singapore. In Singapore, the BMI cut-off figures were revised in 2005 by the Health Promotion Board (HPB), motivated by studies showing that many Asian populations, including Singaporeans, have a higher proportion of body fat and increased risk for cardiovascular diseases and diabetes mellitus, compared with general BMI recommendations in other countries. The BMI cut-offs are presented with an emphasis on health risk rather than weight. United Kingdom.
In the UK, NICE guidance recommends that prevention of type 2 diabetes should start at a BMI of 30 in White and 27.5 in Black African, African-Caribbean, South Asian, and Chinese populations. Research since 2021, based on a large sample of almost 1.5 million people in England, found that some ethnic groups would benefit from prevention at or above a BMI of (rounded): United States. In 1998, the U.S. National Institutes of Health brought U.S. definitions in line with World Health Organization guidelines, lowering the normal/overweight cut-off from a BMI of 27.8 (men) and 27.3 (women) to a BMI of 25. This had the effect of redefining approximately 25 million Americans, previously "healthy", as "overweight". This can partially explain the increase in the "overweight" diagnosis in the past 20 years, and the increase in sales of weight loss products during the same time. WHO also recommends lowering the normal/overweight threshold for southeast Asian body types to around BMI 23, and expects further revisions to emerge from clinical studies of different body types. A survey in 2007 showed 63% of Americans were then overweight or obese, with 26% in the obese category (a BMI of 30 or more). By 2014, 37.7% of adults in the United States were obese, 35.0% of men and 40.4% of women; class 3 obesity (BMI over 40) values were 7.7% for men and 9.9% for women. The U.S. National Health and Nutrition Examination Survey of 2015–2016 showed that 71.6% of American men and women had BMIs over 25. Obesity (a BMI of 30 or more) was found in 39.8% of US adults. Consequences of elevated level in adults. The BMI ranges are based on the relationship between body weight and disease and death. Overweight and obese individuals are at an increased risk for the following diseases: Among people who have never smoked, overweight/obesity is associated with a 51% increase in mortality compared with people who have always been a normal weight. Applications. Public health. The BMI is generally used as a means of correlation between groups related by general mass and can serve as a vague means of estimating adiposity. The duality of the BMI is that, while it is easy to use as a general calculation, it is limited as to how accurate and pertinent the data obtained from it can be. Generally, the index is suitable for recognizing trends within sedentary or overweight individuals because there is a smaller margin of error. The BMI has been used by the WHO as the standard for recording obesity statistics since the early 1980s. This general correlation is particularly useful for consensus data regarding obesity or various other conditions, because it can be used to build a semi-accurate representation from which a solution can be stipulated, or the RDA for a group can be calculated. Similarly, this is becoming more and more pertinent to the growth of children, since the majority of children are sedentary. Cross-sectional studies have indicated that sedentary people can decrease BMI by becoming more physically active. Smaller effects are seen in prospective cohort studies, which lends support to active mobility as a means to prevent a further increase in BMI. Legislation. In France, Italy, and Spain, legislation has been introduced banning the use of fashion models with a BMI below 18. In Israel, models with a BMI below 18.5 are banned. This is done to fight anorexia among models and people interested in fashion. Relationship to health.
A study published by "Journal of the American Medical Association" ("JAMA") in 2005 showed that "overweight" people had a death rate similar to "normal" weight people as defined by BMI, while "underweight" and "obese" people had a higher death rate. A study published by "The Lancet" in 2009 involving 900,000 adults showed that "overweight" and "underweight" people both had a mortality rate higher than "normal" weight people as defined by BMI. The optimal BMI was found to be in the range of 22.5–25. The average BMI of athletes is 22.4 for women and 23.6 for men. High BMI is associated with type 2 diabetes only in people with high serum gamma-glutamyl transpeptidase. In an analysis of 40 studies involving 250,000 people, patients with coronary artery disease with "normal" BMIs were at higher risk of death from cardiovascular disease than people whose BMIs put them in the "overweight" range (BMI 25–29.9). One study found that BMI had a good general correlation with body fat percentage, and noted that obesity has overtaken smoking as the world's number one cause of death. But it also notes that in the study 50% of men and 62% of women were obese according to body fat defined obesity, while only 21% of men and 31% of women were obese according to BMI, meaning that BMI was found to underestimate the number of obese subjects. A 2010 study that followed 11,000 subjects for up to eight years concluded that BMI is not the most appropriate measure for the risk of heart attack, stroke or death. A better measure was found to be the waist-to-height ratio. A 2011 study that followed 60,000 participants for up to 13 years found that waist–hip ratio was a better predictor of ischaemic heart disease mortality. Limitations. The medical establishment and statistical community have both highlighted the limitations of BMI. Racial and gender differences. Part of the statistical limitations of the BMI scale is the result of Quetelet's original sampling methods. As noted in his primary work, "A Treatise on Man and the Development of His Faculties", the data from which Quetelet derived his formula was taken mostly from Scottish Highland soldiers and French Gendarmerie. The BMI was always designed as a metric for European men. For women, and people of non-European origin, the scale is often biased. As noted by sociologist Sabrina Strings, the BMI is largely inaccurate for black people especially, disproportionately labelling them as overweight even for healthy individuals. A 2012 study of BMI in an ethnically diverse population showed that "adult overweight and obesity were associated with an increased risk of mortality ... across the five racial/ethnic groups". Scaling. The BMI depends upon weight and the "square" of height. Since mass increases to the "third power" of linear dimensions, taller individuals with exactly the same body shape and relative composition have a larger BMI. BMI is proportional to the mass and inversely proportional to the square of the height. So, if all body dimensions double, and mass scales naturally with the cube of the height, then BMI doubles instead of remaining the same. This results in taller people having a reported BMI that is uncharacteristically high, compared to their actual body fat levels. In comparison, the Ponderal index is based on the natural scaling of mass with the third power of the height. However, many taller people are not just "scaled up" short people but tend to have narrower frames in proportion to their height. Carl Lavie has written that "The B.M.I. 
tables are excellent for identifying obesity and body fat in large populations, but they are far less reliable for determining fatness in individuals." For US adults, exponent estimates range from 1.92 to 1.96 for males and from 1.45 to 1.95 for females. Physical characteristics. The BMI overestimates roughly 10% for a large (or tall) frame and underestimates roughly 10% for a smaller frame (short stature). In other words, people with small frames would be carrying more fat than optimal, but their BMI indicates that they are "normal". Conversely, large-framed (or tall) individuals may be quite healthy, with a fairly low body fat percentage, but be classified as "overweight" by BMI. For example, a height/weight chart may say the ideal weight (BMI 21.5) for a man is . But if that man has a slender build (small frame), he may be overweight at and should reduce by 10% to roughly (BMI 19.4). In the opposite case, the man with a larger frame and more solid build should increase by 10%, to roughly (BMI 23.7). If one teeters on the edge of small/medium or medium/large, common sense should be used in calculating one's ideal weight. However, falling into one's ideal weight range for height and build is still not as accurate in determining health risk factors as waist-to-height ratio and actual body fat percentage. Accurate frame size calculators use several measurements (wrist circumference, elbow width, neck circumference, and others) to determine what category an individual falls into for a given height. The BMI also fails to take into account loss of height through ageing. In this situation, BMI will increase without any corresponding increase in weight. Muscle versus fat. Assumptions about the distribution between muscle mass and fat mass are inexact. BMI generally overestimates adiposity in those with leaner body mass (e.g., athletes) and underestimates excess adiposity in those with fattier body mass. A study in June 2008 by Romero-Corral et al. examined 13,601 subjects from the United States' third National Health and Nutrition Examination Survey (NHANES III) and found that BMI-defined obesity (BMI ≥ 30) was present in 21% of men and 31% of women. Body fat-defined obesity was found in 50% of men and 62% of women. While BMI-defined obesity showed high specificity (95% for men and 99% for women), BMI showed poor sensitivity (36% for men and 49% for women). In other words, the BMI will be mostly correct when determining a person to be obese, but can err quite frequently when determining a person not to be. Despite this undercounting of obesity by BMI, BMI values in the intermediate range of 20–30 were found to be associated with a wide range of body fat percentages. For men with a BMI of 25, about 20% have a body fat percentage below 20% and about 10% have a body fat percentage above 30%. Body composition for athletes is often better calculated using measures of body fat, as determined by such techniques as skinfold measurements or underwater weighing, and the limitations of manual measurement have also led to alternative methods to measure obesity, such as the body volume indicator. Variation in definitions of categories. It is not clear where on the BMI scale the threshold for "overweight" and "obese" should be set. Because of this, the standards have varied over the past few decades. Between 1980 and 2000 the U.S. Dietary Guidelines defined overweight at a variety of levels ranging from a BMI of 24.9 to 27.1.
In 1985, the National Institutes of Health (NIH) consensus conference recommended that overweight be defined as a BMI of 27.8 for men and 27.3 for women. In 1998, an NIH report concluded that a BMI over 25 is overweight and a BMI over 30 is obese. In the 1990s the World Health Organization (WHO) decided that a BMI of 25 to 30 should be considered overweight and a BMI over 30 obese, the standards the NIH subsequently adopted. This became the definitive guide for determining if someone is overweight. One study found that the vast majority of people labelled 'overweight' and 'obese' according to current definitions do not in fact face any meaningful increased risk for early death. In a quantitative analysis of several studies, involving more than 600,000 men and women, the lowest mortality rates were found for people with BMIs between 23 and 29; most of the 25–30 range considered 'overweight' was not associated with higher risk. Alternatives. Corpulence index (exponent of 3). The corpulence index uses an exponent of 3 rather than 2. The corpulence index yields valid results even for very short and very tall people, which is a problem with BMI. For example, a tall person at an ideal body weight of gives a normal BMI of 20.74 and CI of 13.6, while a tall person with a weight of gives a BMI of 24.84, very close to an overweight BMI of 25, and a CI of 12.4, very close to a normal CI of 12. New BMI (exponent of 2.5). A study found that the best exponent E for predicting body fat percentage would be between 2 and 2.5 in

$$\frac{\text{mass}}{\text{height}^{E}}$$

An exponent of 5/2 or 2.5 was proposed by Quetelet in the 19th century: "In general, we do not err much when we assume that during development the squares of the weight at different ages are as the fifth powers of the height." This exponent of 2.5 is used in a revised formula for Body Mass Index, proposed by Nick Trefethen, Professor of numerical analysis at the University of Oxford, which minimizes the distortions for shorter and taller individuals resulting from the use of an exponent of 2 in the traditional BMI formula:

$$\mathrm{BMI}_{\text{new}} = 1.3 \times \frac{\text{mass}_{\text{kg}}}{\left(\text{height}_{\text{m}}\right)^{2.5}}$$

The scaling factor of 1.3 was determined to make the proposed new BMI formula align with the traditional BMI formula for adults of average height, while the exponent of 2.5 is a compromise between the exponent of 2 in the traditional formula for BMI and the exponent of 3 that would be expected for the scaling of weight (which at constant density would theoretically scale with volume, i.e., as the cube of the height) with height. In Trefethen's analysis, an exponent of 2.5 was found to fit empirical data more closely with less distortion than either an exponent of 2 or 3. BMI prime (exponent of 2, normalization factor). BMI Prime, a modification of the BMI system, is the ratio of actual BMI to upper limit optimal BMI (currently defined at 25 kg/m2), i.e., the actual BMI expressed as a proportion of the upper limit optimal. BMI Prime is a dimensionless number independent of units. Individuals with BMI Prime less than 0.74 are underweight; those between 0.74 and 1.00 have optimal weight; and those at 1.00 or greater are overweight. BMI Prime is useful clinically because it shows by what ratio (e.g. 1.36) or percentage (e.g. 136%, or 36% above) a person deviates from the maximum optimal BMI. For instance, a person with BMI 34 kg/m2 has a BMI Prime of 34/25 = 1.36, and is 36% over their upper mass limit.
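The alternative exponents discussed above are easy to compare side by side. Below is a minimal Python sketch, under the definitions given in this section, of the corpulence index (exponent of 3), the revised BMI (exponent of 2.5 with the 1.3 scaling factor), and BMI Prime (traditional BMI divided by the upper-limit optimal value of 25). The doubling example at the end illustrates the scaling behaviour described under Limitations; all function names and sample figures are illustrative.

```python
def bmi(mass_kg: float, height_m: float) -> float:
    return mass_kg / height_m ** 2          # traditional exponent of 2

def corpulence_index(mass_kg: float, height_m: float) -> float:
    return mass_kg / height_m ** 3          # exponent of 3

def new_bmi(mass_kg: float, height_m: float) -> float:
    return 1.3 * mass_kg / height_m ** 2.5  # Trefethen's revised formula

def bmi_prime(mass_kg: float, height_m: float, upper: float = 25.0) -> float:
    return bmi(mass_kg, height_m) / upper   # ratio to upper-limit optimal BMI

# Doubling every linear dimension multiplies mass by 8 (2 cubed), so the
# traditional BMI doubles while the corpulence index is unchanged:
print(bmi(70, 1.75), bmi(8 * 70, 2 * 1.75))                            # ~22.9 -> ~45.7
print(corpulence_index(70, 1.75), corpulence_index(8 * 70, 2 * 1.75))  # ~13.1 both
# A person with BMI 34 (e.g. 85 kg at 1.58 m) has BMI Prime ~1.36,
# matching the worked example in the text above.
print(round(bmi_prime(85, 1.58), 2))
```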
In South East Asian and South Chinese populations (see § international variations), BMI Prime should be calculated using an upper limit BMI of 23 in the denominator instead of 25. BMI Prime allows easy comparison between populations whose upper-limit optimal BMI values differ. Waist circumference. Waist circumference is a good indicator of visceral fat, which poses more health risks than fat elsewhere. According to the U.S. National Institutes of Health (NIH), waist circumference in excess of for men and for (non-pregnant) women is considered to imply a high risk for type 2 diabetes, dyslipidemia, hypertension, and cardiovascular disease (CVD). Waist circumference can be a better indicator of obesity-related disease risk than BMI. For example, this is the case in populations of Asian descent and older people. for men and for women has been stated to pose "higher risk", with the NIH figures "even higher". Waist-to-hip circumference ratio has also been used, but has been found to be no better than waist circumference alone, and more complicated to measure. A related indicator is waist circumference divided by height. A 2013 study identified critical threshold values for waist-to-height ratio according to age, with consequent significant reduction in life expectancy if exceeded. These are: 0.5 for people under 40 years of age, 0.5 to 0.6 for people aged 40–50, and 0.6 for people over 50 years of age. Surface-based body shape index. The Surface-based Body Shape Index (SBSI) is far more rigorous and is based upon four key measurements: the body surface area (BSA), vertical trunk circumference (VTC), waist circumference (WC) and height (H). Data on 11,808 subjects from the National Health and Human Nutrition Examination Surveys (NHANES) 1999–2004 showed that SBSI outperformed BMI, waist circumference, and A Body Shape Index (ABSI), an alternative to BMI.

$$\mathrm{SBSI} = \frac{H^{7/4}\,\mathrm{WC}^{5/6}}{\mathrm{BSA}\cdot\mathrm{VTC}}$$

A simplified, dimensionless form of SBSI, known as SBSI*, has also been developed:

$$\mathrm{SBSI}^{*} = \frac{H^{2}\,\mathrm{WC}}{\mathrm{BSA}\cdot\mathrm{VTC}}$$

Modified body mass index. Within some medical contexts, such as familial amyloid polyneuropathy, serum albumin is factored in to produce a modified body mass index (mBMI). The mBMI can be obtained by multiplying the BMI by serum albumin, in grams per litre.
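As an illustration of the age-dependent waist-to-height thresholds from the 2013 study cited above, here is a minimal Python sketch. The function names are illustrative, and the study's 0.5-to-0.6 band for ages 40–50 is represented as a range rather than a single cut-off, since the text does not specify how the threshold varies within that decade.

```python
def waist_to_height_ratio(waist_cm: float, height_cm: float) -> float:
    """Waist circumference divided by height, in matching units."""
    return waist_cm / height_cm

def critical_whtr_band(age_years: float) -> tuple[float, float]:
    """Critical waist-to-height values by age, per the 2013 study above."""
    if age_years < 40:
        return (0.5, 0.5)
    if age_years <= 50:
        return (0.5, 0.6)   # the study gives a band for this age range
    return (0.6, 0.6)

ratio = waist_to_height_ratio(85, 175)      # ~0.486
low, high = critical_whtr_band(45)
print(round(ratio, 3), "exceeds threshold" if ratio > high else "within threshold")
```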
4789
25170845
https://en.wikipedia.org/wiki?curid=4789
Behistun Inscription
The Behistun Inscription (also Bisotun, Bisitun or Bisutun; Old Persian: Bagastana, meaning "the place of god") is a multilingual Achaemenid royal inscription and large rock relief on a cliff at Mount Behistun in the Kermanshah Province of Iran, near the city of Kermanshah in western Iran, established by Darius the Great (r. 522–486 BC). It was important to the decipherment of cuneiform, as it is the longest known trilingual cuneiform inscription, written in Old Persian, Elamite, and Babylonian (a variety of Akkadian). Authored by Darius the Great sometime between his coronation as king of the Persian Empire in the summer of 522 BC and his death in the autumn of 486 BC, the inscription begins with a brief autobiography of Darius, including his ancestry and lineage. Later in the inscription, Darius provides a lengthy sequence of events following the death of Cambyses II in which he fought nineteen battles in a period of one year (ending in December 521 BC) to put down multiple rebellions throughout the Persian Empire. The inscription states in detail that the rebellions were orchestrated by several impostors and their co-conspirators in various cities throughout the empire, each of whom falsely proclaimed himself king during the upheaval following Cambyses II's death. Darius the Great proclaimed himself victorious in all battles during the period of upheaval, attributing his success to the "grace of Ahura Mazda". The inscription is approximately 15 metres high by 25 metres wide, and sits some 100 metres up a limestone cliff from an ancient road connecting the capitals of Babylonia and Media (Babylon and Ecbatana, respectively). The Old Persian text contains 414 lines in five columns; the Elamite text includes 260 lines in eight columns, and the Babylonian text is in 112 lines. A copy of the text in Aramaic, written during the reign of Darius II, was found in Egypt. The inscription was illustrated by a life-sized bas-relief of Darius I, the Great, holding a bow as a sign of kingship, with his left foot on the chest of a figure lying supine before him. The supine figure is reputed to be the pretender Gaumata. Darius is attended to the left by two servants, and nine one-meter figures stand to the right, with hands tied and rope around their necks, representing conquered peoples. A Faravahar floats above, giving its blessing to the king. One figure appears to have been added after the others were completed, as was Darius's beard, which is a separate block of stone attached with iron pins and lead. Name. The name Behistun is derived from usage in Ancient Greek and Arabic sources, particularly Diodorus Siculus and Ya'qubi, transliterated into English in the 19th century by Henry Rawlinson. The modern Persian version of the name is Bisotun. History. After the fall of the Persian Empire's Achaemenid Dynasty and its successors, and the lapse of Old Persian cuneiform writing into disuse, the nature of the inscription was forgotten, and fanciful explanations became the norm. In 1598, Englishman Robert Sherley saw the inscription during a diplomatic mission to Safavid Persia on behalf of Austria, and brought it to the attention of Western European scholars. His party incorrectly came to the conclusion that it was Christian in origin. French General Gardanne thought it showed "Christ and his twelve apostles", and Sir Robert Ker Porter thought it represented the Lost Tribes of Israel and Shalmaneser of Assyria. In 1604, Italian explorer Pietro della Valle visited the inscription and made preliminary drawings of the monument. Translation efforts.
German surveyor Carsten Niebuhr visited around 1764 for Frederick V of Denmark, publishing a copy of the inscription in the account of his journeys in 1778. Niebuhr's transcriptions were used by Georg Friedrich Grotefend and others in their efforts to decipher the Old Persian cuneiform script. Grotefend had deciphered ten of the 37 symbols of Old Persian by 1802, after realizing that, unlike the Semitic cuneiform scripts, Old Persian text is alphabetic and each word is separated by a vertical slanted symbol. In 1835, Sir Henry Rawlinson, an officer of the British East India Company army assigned to the forces of the Shah of Iran, began studying the inscription in earnest. As the town of Bisotun's name was anglicized as "Behistun" at this time, the monument became known as the "Behistun Inscription". Despite its relative inaccessibility, Rawlinson was able to scale the cliff with the help of a local boy and copy the Old Persian inscription. The Elamite text was across a chasm, and the Babylonian four meters above; both were beyond easy reach and were left for later. In 1847, he was able to send a full and accurate copy to Europe. Later research and activity. The site was visited by the American linguist A. V. Williams Jackson in 1903. Later expeditions, in 1904 sponsored by the British Museum and led by Leonard William King and Reginald Campbell Thompson, and in 1948 by George G. Cameron of the University of Michigan, obtained photographs, casts and more accurate transcriptions of the texts, including passages that were not copied by Rawlinson. It also became apparent that rainwater had dissolved some areas of the limestone in which the text was inscribed, while leaving new deposits of limestone over other areas, covering the text. In 1938, the inscription became of interest to the Nazi German think tank Ahnenerbe, although research plans were cancelled due to the onset of World War II. The monument later suffered some damage from Allied soldiers using it for target practice in World War II, and during the Anglo-Soviet invasion of Iran. In 1999, Iranian archeologists began the documentation and assessment of damage to the site incurred during the 20th century. Malieh Mehdiabadi, who was project manager for the effort, described a photogrammetric process by which two-dimensional photos were taken of the inscriptions using two cameras and later transformed into 3-D images. In recent years, Iranian archaeologists have been undertaking conservation works. The site became a UNESCO World Heritage Site in 2006. In 2012, the Bisotun Cultural Heritage Center organized an international effort to re-examine the inscription. Content. Lineage. In the first section of the inscription, Darius the Great declares his ancestry and lineage: Territories. Darius also lists the territories under his rule: Conflicts and revolts. Later in the inscription, Darius provides an eye-witness account of battles he successfully fought over a one-year period to put down rebellions which had resulted from the deaths of Cyrus the Great and his son Cambyses II: Other historical monuments in the Behistun complex. The site covers an area of 116 hectares. Archeological evidence indicates that this region became a human shelter 40,000 years ago. There are 18 historical monuments other than the inscription of Darius the Great in the Behistun complex that have been registered in the Iranian national list of historical sites. Some of them are: Similar reliefs and inspiration.
The Anubanini rock relief, also called Sarpol-i Zohab, of the Lullubi king Anubanini, dated to c. 2300 BC and located not far from the Behistun reliefs at Sarpol-e Zahab, is very similar to the reliefs at Behistun. The attitude of the ruler, the trampling of an enemy, and the lines of prisoners are all very similar, to such an extent that it was said that the sculptors of the Behistun Inscription had probably seen the Anubanini relief beforehand and were inspired by it. The Lullubian reliefs were the model for the Behistun reliefs of Darius the Great. The inscriptional tradition of the Achaemenids, starting especially with Darius I, is thought to have derived from the traditions of Elam, Lullubi, the Babylonians and the Assyrians.
4792
28537347
https://en.wikipedia.org/wiki?curid=4792
Barry Goldwater
Barry Morris Goldwater (January 2, 1909 – May 29, 1998) was an American politician and major general in the Air Force Reserve who served as a United States senator from 1953 to 1965 and 1969 to 1987, and was the Republican Party's nominee for president in 1964. Goldwater was born in Phoenix, Arizona, where he helped manage his family's department store. During World War II, he flew aircraft between the U.S. and India. After the war, Goldwater served in the Phoenix City Council. In 1952, he was elected to the U.S. Senate, where he rejected the legacy of the New Deal and, along with the conservative coalition, fought against the New Deal coalition. Goldwater also challenged his party's moderate to liberal wing on policy issues. He supported the Civil Rights Acts of 1957 and 1960 and the 24th Amendment to the U.S. Constitution but opposed the Civil Rights Act of 1964, disagreeing with Title II and Title VII. In the 1964 U.S. presidential election, Goldwater mobilized a large conservative constituency to win the Republican nomination, but then lost the general election to incumbent Democratic president Lyndon B. Johnson in a landslide. Goldwater returned to the Senate in 1969 and specialized in defense and foreign policy. He successfully urged president Richard Nixon to resign in 1974 when evidence of a cover-up in the Watergate scandal became overwhelming and impeachment was imminent. In 1986, he oversaw passage of the Goldwater–Nichols Act, which strengthened civilian authority in the U.S. Department of Defense. Near the end of his career, Goldwater's views on social and cultural issues grew increasingly libertarian. Many political pundits and historians believe he laid the foundation for the conservative revolution to follow as the grassroots organization and conservative takeover of the Republican Party began a long-term realignment in American politics, which helped to bring about the presidency of Ronald Reagan in the 1980s. He also had a substantial impact on the American libertarian movement. After leaving the Senate, Goldwater became supportive of environmental protection, gay rights, including military service and adoption rights for same-sex couples, abortion rights, and the legalization of marijuana. Early life and education. Goldwater was born on January 2, 1909, in Phoenix, Arizona, in what was then the Arizona Territory, the son of Baron M. Goldwater and his wife, Hattie Josephine "JoJo" Williams. Goldwater long believed that he was born on January 1, 1909, and thus works published during his career list this as his date of birth; however, in his later years, he discovered documentation revealing that he was actually born at 3 a.m. on January 2. His father's family founded Goldwater's Department Store, a leading upscale department store in Phoenix. Goldwater's paternal grandfather, Michel Goldwasser, a Polish Jew, was born in 1821 in Konin, then part of Congress Poland. He emigrated to London following the Revolutions of 1848. Soon after arriving in London, Michel anglicized his name to Michael Goldwater. Michel married Sarah Nathan, a member of an English-Jewish family, in the Great Synagogue of London. The Goldwaters later emigrated to the United States, first arriving in San Francisco, California, before finally settling in the Arizona Territory, where Michael Goldwater opened a small department store that was later taken over and expanded by his three sons, Henry, Baron, and Morris. 
Morris Goldwater (1852–1939) was an Arizona territorial and state legislator, mayor of Prescott, Arizona, delegate to the Arizona Constitutional Convention, and later President of the Arizona State Senate. Goldwater's father was Jewish, but Goldwater was raised in his mother's Episcopal faith. Hattie Williams came from an established New England family that included the theologian Roger Williams, of Rhode Island. Goldwater's parents were married in an Episcopal church in Phoenix; for his entire life, Goldwater was an Episcopalian, though on rare occasions he referred to himself as Jewish. While he did not often attend church, he stated that "If a man acts in a religious way, an ethical way, then he's really a religious man—and it doesn't have a lot to do with how often he gets inside a church." His first cousin was Julius Goldwater, a convert to Buddhism and Jodo Shinshu priest who assisted interned Japanese Americans during World War II. After performing poorly academically as a high school freshman, Goldwater's parents sent him to Staunton Military Academy in Staunton, Virginia, where he played varsity football, basketball, track, and swimming; was senior class treasurer; and attained the rank of captain. He graduated from the academy in 1928 and enrolled at the University of Arizona, but dropped out after one year. Barry Goldwater is the most recent non-college graduate to be the nominee of a major political party in a presidential election. Goldwater entered the family's business around the time of his father's death, in 1930. Six years later, he took over the department store, though he was not particularly enthused about running the business. Career. U.S. Air Force. After the United States entered World War II, Goldwater received a reserve commission in the United States Army Air Force. Goldwater trained as a pilot and was assigned to the Ferry Command, a newly formed unit that flew aircraft and supplies to war zones worldwide. He spent most of the war flying between the U.S. and India, via the Azores and North Africa or South America, Nigeria, and Central Africa. Goldwater also flew "the hump", one of the most dangerous routes for supply planes during WWII. The route required aircraft to fly directly over the Himalayas in order to deliver desperately needed supplies to the Republic of China. Following the end of World War II in 1945, Goldwater was a leading proponent of creating the United States Air Force Academy and later served on the academy's Board of Visitors. The visitor center at the academy is now named in his honor. Goldwater remained in the Army Air Reserve after the war, and in 1946, at the rank of Colonel, Goldwater founded the Arizona Air National Guard. Goldwater ordered the Arizona Air National Guard desegregated, two years before the rest of the U.S. military. In the early 1960s, while a senator, he commanded the 9999th Air Reserve Squadron as a major general. Goldwater was instrumental in pushing the Pentagon to support the desegregation of the armed services. Goldwater remained in the Arizona Air National Guard until 1967, retiring as a Command Pilot with the rank of major general. As a U.S. Senator, Goldwater had a sign in his office that referenced his military career and mindset: "There are old pilots and there are bold pilots, but there are no old, bold pilots." Early political involvement. In a heavily Democratic state, Goldwater became a conservative Republican and a friend of Herbert Hoover. 
He was outspoken against New Deal liberalism, especially its close ties to labor unions. A pilot, amateur radio operator, outdoorsman, and photographer, he criss-crossed Arizona and developed a deep interest in both the natural and the human history of the state. He entered Phoenix politics in 1949, when he was elected to the City Council as part of a nonpartisan team of candidates pledged to clean up widespread prostitution and gambling. The team won every mayoral and council election for the next two decades. Goldwater rebuilt the weak Republican Party and was instrumental in electing Howard Pyle as Governor in 1950. Support for civil rights. Goldwater was a supporter of racial equality. He integrated his family's business upon taking over control in the 1930s. A lifetime member of the NAACP, Goldwater helped found the group's Arizona chapter. He saw to it that the Arizona Air National Guard was racially integrated from its inception in 1946, two years before President Truman ordered that the military as a whole be integrated (a process that was not completed until 1954). Goldwater worked with Phoenix civil rights leaders to successfully integrate public schools a year prior to "Brown v. Board of Education". Despite this support for civil rights, he remained opposed to some major federal civil rights legislation. Civil rights leaders like Martin Luther King Jr. remarked of him, "while not himself a racist, Mr. Goldwater articulates a philosophy which gives aid and comfort to the racists." Goldwater was an early member and largely unrecognized supporter of the National Urban League's Phoenix chapter, going so far as to cover the group's early operating deficits with his personal funds. Though the NAACP denounced Goldwater in the harshest of terms when he ran for president, the Urban League conferred on him the 1991 Humanitarian Award "for 50 years of loyal service to the Phoenix Urban League". In response to League members who objected, citing Goldwater's vote on the Civil Rights Act of 1964, the League president pointed out that he had saved the League more than once, saying he preferred to judge a person "on the basis of his daily actions rather than on his voting record". U.S. Senator. Running as a Republican, Goldwater won a narrow upset victory in the 1952 Arizona Senate election against veteran Democrat and Senate Majority Leader Ernest McFarland. He won largely by defeating McFarland in his native Maricopa County by 12,600 votes, almost double the overall margin of 6,725 votes. Goldwater defeated McFarland by a larger margin when he ran again in 1958. Following his strong re-election showing, he became the first Arizona Republican to win a second term in the U.S. Senate. Goldwater's victory was all the more remarkable since it came in a year in which Democrats gained 13 seats in the Senate. During his Senate career, Goldwater was regarded as the "Grand Old Man of the Republican Party and one of the nation's most respected exponents of conservatism". Criticism of Eisenhower administration. Goldwater was outspoken about the Eisenhower administration, calling some of the policies of the administration too liberal for a Republican president. "Democrats delighted in pointing out that the junior senator was so headstrong that he had gone out of his way to criticize the president of his own party."
There was a Democratic majority in Congress for most of Eisenhower's presidency, and Goldwater felt that President Dwight Eisenhower was compromising too much with Democrats in order to get legislation passed. Early in his career as a senator for Arizona, he criticized the $71.8 billion budget that President Eisenhower sent to Congress, stating, "Now, however, I am not so sure. A $71.8 billion budget not only shocks me, but it weakens my faith." Goldwater opposed Eisenhower's pick of Earl Warren for Chief Justice of the United States. "The day that Eisenhower appointed Governor Earl Warren of California as Chief Justice of the Supreme Court, Goldwater did not hesitate to express his misgivings." However, Goldwater was present in the United States Senate on March 1, 1954, when Warren was unanimously confirmed, voted in favor of Eisenhower's nomination of John Marshall Harlan II on March 16, 1955, was present for the unanimous nominations of William J. Brennan Jr. and Charles Evans Whittaker on March 19, 1957, and voted in favor of the nomination of Potter Stewart on May 5, 1959. Stance on civil rights. In his first year in the Senate, Goldwater was responsible for the desegregation of the Senate cafeteria after he insisted that his Black legislative assistant, Katherine Maxwell, be served along with every other Senate employee. Goldwater and the Eisenhower administration supported the integration of schools in the South, but Goldwater felt the states should choose how they wanted to integrate and should not be forced by the federal government. "Goldwater criticized the use of federal troops. He accused the Eisenhower administration of violating the Constitution by assuming powers reserved by the states. While he agreed that under the law, every state should have integrated its schools, each state should integrate in its own way." Some high-ranking government officials, including an Army general, echoed Goldwater's critical stance toward the Eisenhower administration. "Fulbright's startling revelation that military personnel were being indoctrinated with the idea that the policies of the Commander in Chief were treasonous dovetailed with the return to the news of the strange case of General Edwin Walker." In his 1960 book "The Conscience of a Conservative", Goldwater stated that he supported the stated objectives of the Supreme Court's decision in "Brown v. Board of Education", but argued that the federal government had no role in ordering states to desegregate public schools. He wrote:"I believe that it "is" both wise and just for negro children to attend the same schools as whites, and that to deny them this opportunity carries with it strong implications of inferiority. I am not prepared, however, to impose that judgement of mine on the people of Mississippi or South Carolina, or to tell them what methods should be adopted and what pace should be kept in striving toward that goal. That is their business, not mine. I believe that the problem of race relations, like all social and cultural problems, is best handled by the people directly concerned. Social and cultural change, however desirable, should not be effected by the engines of national power."Goldwater voted in favor of both the Civil Rights Act of 1957 and the 24th Amendment to the U.S. Constitution, but did not vote on the Civil Rights Act of 1960 because he was absent from the chamber, while Senate Minority Whip Thomas Kuchel (R–CA) announced that Goldwater would have voted in favor if present.
Although he had voted in favor of it in committee, Goldwater reluctantly voted against the Civil Rights Act of 1964 when it came to the floor. Later, Goldwater would state that he was mostly in support of the bill, but he disagreed with Title II, which dealt with public accommodations, and Title VII, which dealt with employment, believing that the law would result in the government dictating hiring and firing policy for millions of Americans. Congressional Republicans overwhelmingly supported the bill, with Goldwater being joined by only five other Republican senators in voting against it. Goldwater likely underestimated the effect this vote would have: it hurt him with voters across the country, including in his own party. In the 1990s, Goldwater would call his vote on the Civil Rights Act "one of his greatest regrets." Goldwater was absent from the Senate during President John F. Kennedy's nomination of Byron White to the Supreme Court on April 11, 1962, but was present when Arthur Goldberg was unanimously confirmed. 1964 presidential election. Goldwater's direct style had made him extremely popular with the Republican Party's suburban conservative voters, based in the South and the senator's native West. Following the success of "The Conscience of a Conservative", Goldwater became the frontrunner for the GOP presidential nomination to run against John F. Kennedy. Despite their disagreements on politics, Goldwater and Kennedy had grown to become close friends during the eight years they served alongside each other in the Senate. With Goldwater the clear GOP frontrunner, he and Kennedy began planning to campaign together, holding Lincoln–Douglas-style debates across the country and avoiding the kind of negative attacks that were increasingly coming to define American politics. Republican primary. Goldwater was grief-stricken by the assassination of Kennedy and was greatly disappointed that his opponent in 1964 would not be Kennedy but instead his vice president, former Senate Majority Leader Lyndon B. Johnson of Texas. Goldwater disliked Johnson, later telling columnist John Kolbe that Johnson had "used every dirty trick in the bag." At the time of Goldwater's presidential candidacy, the Republican Party was split between its conservative wing (based in the West and South) and its moderate/liberal wing, sometimes called Rockefeller Republicans (based in the Northeast and Midwest). Goldwater alarmed even some of his fellow partisans with his brand of staunch fiscal conservatism and militant anti-communism. He was viewed by many moderate and liberal Republicans as being too far on the right wing of the political spectrum to appeal to the mainstream majority necessary to win a national election. As a result, moderate and liberal Republicans recruited a series of opponents, including New York Governor Nelson Rockefeller, Henry Cabot Lodge Jr. of Massachusetts, and Pennsylvania Governor William Scranton, to challenge him. Goldwater received solid backing from most of the few Southern Republicans then in politics. A young Birmingham lawyer, John Grenier, secured commitments from 271 of 279 Southern convention delegates to back Goldwater. Grenier would serve as executive director of the national GOP during the Goldwater campaign, the number-two position to party chairman Dean Burch of Arizona. Goldwater fought and won a multi-candidate race for the Republican Party's presidential nomination. 1964 Republican National Convention. 
Eisenhower gave his support to Goldwater when he told reporters, "I personally believe that Goldwater is not an extremist as some people have made him, but in any event we're all Republicans." His nomination was staunchly opposed by the so-called liberal Republicans, who thought Goldwater's demand for active measures to defeat the Soviet Union would foment a nuclear war. In addition to Rockefeller, prominent Republican office-holders refused to endorse Goldwater's candidacy, including both Republican senators from New York, Kenneth B. Keating and Jacob Javits, Pennsylvania governor William Scranton, Michigan governor George Romney, and Congressman John V. Lindsay (NY-17). Rockefeller Republican Jackie Robinson walked out of the convention in disgust over Goldwater's nomination. Henry Cabot Lodge Jr., who had been Richard Nixon's running mate in 1960, also opposed Goldwater, calling his proposal to realign the Democratic and Republican parties into separate liberal and conservative parties "totally abhorrent" and insisting that no one in their right mind should oppose a federal role in America's future. In the face of such opposition, Goldwater delivered a well-received acceptance speech. According to the author Lee Edwards: "[Goldwater] devoted more care [to it] than to any other speech in his political career. And with good reason: he would deliver it to the largest and most attentive audience of his life." Journalist John Adams commented that "his acceptance speech was bold, reflecting his conservative views, but not irrational. Rather than shrinking from those critics who accuse him of extremism, Goldwater challenged them head-on." In his own words: "I would remind you that extremism in the defense of liberty is no vice. And let me remind you also that moderation in the pursuit of justice is no virtue." His paraphrase of Cicero was included at the suggestion of Harry V. Jaffa, though the speech was primarily written by Karl Hess. Because of President Johnson's popularity, Goldwater refrained from attacking the president directly; he did not mention Johnson by name at all in his convention speech. Although raised as an Episcopalian, Goldwater was the first candidate of Jewish descent, through his father, to be nominated for president by a major American party. 1964 general presidential campaign. After securing the Republican presidential nomination, Goldwater chose his political ally, RNC Chairman William E. Miller, to be his running mate. Goldwater joked that he chose Miller because "he drives Johnson nuts". In choosing Miller, Goldwater opted for a running mate who was ideologically aligned with his own conservative wing of the Republican Party. Miller balanced the ticket in other ways, being a practicing Catholic from the East Coast. Miller had low name recognition, but he was popular in the Republican Party and viewed as a skilled political strategist. Former U.S. senator Prescott Bush, a moderate Republican from Connecticut, was a friend of Goldwater and supported him in the general election campaign. Future chief justice of the United States and fellow Arizonan William H. Rehnquist also first came to the attention of national Republicans through his work as a legal adviser to Goldwater's presidential campaign. Rehnquist had begun his law practice in 1953 in the firm of Denison Kitchel of Phoenix, Goldwater's national campaign manager and friend of nearly three decades. Goldwater's advocacy of active interventionism to prevent the spread of communism and defend American values and allies led to effective counterattacks from Lyndon B. 
Johnson and his supporters, who said that Goldwater's militancy would have dire consequences, possibly even nuclear war. In a May 1964 speech, Goldwater suggested that nuclear weapons should be treated more like conventional weapons and used in Vietnam, specifically that they should have been used at Dien Bien Phu in 1954 to defoliate trees. Regarding Vietnam, Goldwater charged that Johnson's policy was devoid of "goal, course, or purpose," leaving "only sudden death in the jungles and the slow strangulation of freedom". Goldwater's rhetoric on nuclear war was viewed by many as quite uncompromising, a view buttressed by off-hand comments such as, "Let's lob one into the men's room at the Kremlin." He also advocated that field commanders in Vietnam and Europe should be given the authority to use tactical nuclear weapons (which he called "small conventional nuclear weapons") without presidential confirmation. Goldwater countered the Johnson attacks by criticizing the administration for its perceived ethical lapses, and stating in a commercial that "we, as a nation, are not far from the kind of moral decay that has brought on the fall of other nations and people... I say it is time to put conscience back in government. And by good example, put it back in all walks of American life." Goldwater campaign commercials included statements of support by actor Raymond Massey and moderate Republican senator Margaret Chase Smith. Before the 1964 election, "Fact" magazine, published by Ralph Ginzburg, ran a special issue titled "The Unconscious of a Conservative: A Special Issue on the Mind of Barry Goldwater". The two main articles contended that Goldwater was mentally unfit to be president. The magazine supported this claim with the results of a poll of board-certified psychiatrists. "Fact" had mailed questionnaires to 12,356 psychiatrists, receiving responses from 2,417, of whom 1,189 said Goldwater was mentally incapable of holding the office of president. Most of the other respondents declined to diagnose Goldwater because they had not clinically interviewed him, though some added that, while he was not psychologically unfit to preside, he would be negligent in the role. After the election, Goldwater sued the publisher, the editor, and the magazine for libel in "Goldwater v. Ginzburg". "Although the jury awarded Goldwater only $1.00 in compensatory damages against all three defendants, it went on to award him punitive damages of $25,000 against Ginzburg and $50,000 against "Fact" magazine, Inc." According to Warren Boroson, then-managing editor of "Fact" and later a financial columnist, the main biography of Goldwater in the magazine was written by David Bar-Illan, the Israeli pianist. Political advertising. A Democratic campaign advertisement known as "Daisy" showed a young girl counting daisy petals, from one to ten. Immediately following this scene, a voiceover counted down from ten to one. The child's face was shown as a still photograph, followed by images of nuclear explosions and mushroom clouds. The campaign advertisement ended with a plea to vote for Johnson, implying that Goldwater (though not mentioned by name) would provoke a nuclear war if elected. The advertisement, which featured only a few spoken words and relied on imagery for its emotional impact, was one of the most provocative in American political campaign history, and many analysts credit it as the birth of the modern style of "negative political ads" on television. 
The ad aired only once and was immediately pulled, but it was then shown many times by local television stations covering the controversy. Goldwater did not have ties to the Ku Klux Klan (KKK), but he was publicly endorsed by members of the organization. Lyndon B. Johnson exploited this association during the elections, but Goldwater barred the KKK from supporting him and denounced them. Throughout the presidential campaign, Goldwater refused to appeal to racial tensions or the backlash against civil rights. After the outbreak of the Harlem riot of 1964, Goldwater privately gathered news reporters on his campaign plane and said that if anyone attempted to sow racial violence on his political behalf, he would withdraw from the presidential race, even if it was the day before the election. Past comments came back to haunt Goldwater throughout the campaign. He had once called the Eisenhower administration "a dime-store New Deal", and the former president never fully forgave him. However, Eisenhower did film a television commercial with Goldwater. Eisenhower qualified his voting for Goldwater in November by remarking that he had voted not specifically for Goldwater, but for the Republican Party. In December 1961, Goldwater had told a news conference that "sometimes I think this country would be better off if we could just saw off the Eastern Seaboard and let it float out to sea." That comment boomeranged on him during the campaign in the form of a Johnson television commercial, as did remarks about making Social Security voluntary and statements in Tennessee about selling the Tennessee Valley Authority, a large local New Deal employer. The Goldwater campaign spotlighted Ronald Reagan, who appeared in a campaign ad. In turn, Reagan gave a stirring, nationally televised speech, "A Time for Choosing", in support of Goldwater. Results. Goldwater won only his home state of Arizona and five states in the Deep South. The Southern states, traditionally Democratic up to that time, voted Republican primarily as a statement of opposition to the Civil Rights Act, which had been signed into law by Johnson earlier that year. Despite Johnson's support for the Civil Rights Act, the bill had received split support from congressional Democrats because of Southern opposition. In contrast, congressional Republicans overwhelmingly supported the bill, with Goldwater being joined by only five other Republican senators in voting against it. In the end, Goldwater received 38% of the popular vote and carried just six states: Arizona (with 51% of the popular vote) and the core states of the Deep South: Alabama, Georgia, Louisiana, Mississippi, and South Carolina. In carrying Georgia by a margin of 54–45%, Goldwater became the first Republican nominee to win the state. Goldwater's poor showing pulled down many supporters. Of the 57 Republican congressmen who endorsed Goldwater before the convention, 20 were defeated for reelection, along with many promising young Republicans. In contrast, Republican Congressman John Lindsay (NY-17), who had refused to endorse Goldwater, was handily re-elected in a district where Democrats held a 10% overall advantage. On the other hand, the defeat of so many older politicians created openings for young conservatives to move up the ladder. While the loss of moderate Republicans was temporary (they were back by 1966), Goldwater also permanently pulled many conservative Southerners and whites out of the New Deal Coalition. 
According to Steve Kornacki of "Salon", "Goldwater broke through and won five [Southern] states—the best showing in the region for a GOP candidate since Reconstruction. In Mississippi—where Franklin D. Roosevelt had won nearly 100 percent of the vote 28 years earlier—Goldwater claimed a staggering 87 percent." It has frequently been argued that Goldwater's strong performance in Southern states previously regarded as Democratic strongholds foreshadowed a larger shift in electoral trends in the coming decades that would make the South a Republican bastion (an end to the "Solid South"), first in presidential politics and eventually at the congressional and state levels as well. Goldwater's uncompromising promotion of freedom also marked the start of a continuing shift in American politics from liberalism toward a conservative economic philosophy. Return to U.S. Senate. Goldwater remained popular in Arizona, and in the 1968 Senate election he was elected to the seat of the retiring Senator Carl Hayden. He was reelected in 1974 and 1980. Throughout the late 1970s, as the conservative wing under Ronald Reagan gained control of the Republican Party, Goldwater concentrated on his Senate duties, especially in military affairs. Goldwater purportedly did not like Richard Nixon on either a political or personal level, later calling the California Republican "the most dishonest individual I have ever met in my life". Accordingly, he played little part in Nixon's election or administration, but he helped force Nixon's resignation in 1974. At the height of the Watergate scandal, Goldwater met with Nixon at the White House and urged him to resign. At the time, Nixon's impeachment by the House of Representatives was imminent, and Goldwater warned him that fewer than 10 Republican senators would vote against conviction. Although 1974 was a difficult year for Republican candidates, Goldwater was easily reelected over his Democratic opponent, Jonathan Marshall, the publisher of "The Scottsdale Progress". At the 1976 Republican National Convention, Goldwater helped block Nelson Rockefeller's renomination as vice president. When Reagan challenged Gerald Ford for the presidential nomination in 1976, Goldwater endorsed the incumbent Ford, looking for consensus rather than conservative idealism. As one historian notes, "The Arizonan had lost much of his zest for battle." In 1979, when President Carter normalized relations with Communist China, Goldwater and some other senators sued him in the Supreme Court, arguing that the president could not terminate the Sino-American Mutual Defense Treaty with the Republic of China (Taiwan) without the approval of Congress. The case, "Goldwater v. Carter" (444 U.S. 996), was dismissed by the court as a political question. On June 9, 1969, Goldwater was absent for the vote on President Nixon's nomination of Warren E. Burger as Chief Justice of the United States, though Senate Minority Whip Hugh Scott announced that Goldwater would have voted in favor if present. Goldwater voted in favor of Nixon's failed Supreme Court nominations of Clement Haynsworth on November 21, 1969, and of Harrold Carswell on April 8, 1970. The following month, Goldwater was absent when Nixon nominee Harry Blackmun was confirmed on May 12, 1970, though Senate Minority Whip Robert P. Griffin announced that Goldwater would have voted in favor if present. 
On December 6, 1971, Goldwater voted in favor of Nixon's nomination of Lewis F. Powell Jr., and on December 10, he voted in favor of Nixon's nomination of William Rehnquist as Associate Justice. On December 17, 1975, Goldwater voted in favor of President Gerald Ford's nomination of John Paul Stevens to the Supreme Court. Final campaign and U.S. Senate term. With his fourth Senate term due to end in January 1981, Goldwater seriously considered retiring from the Senate in 1980 before deciding to run for one final term. It was a surprisingly tough campaign for re-election. Goldwater was viewed by some as out of touch and vulnerable for several reasons, chiefly because he had planned to retire in 1981 and had not visited many areas of Arizona outside of Phoenix and Tucson. Additionally, his Democratic challenger, Bill Schulz, proved to be a formidable opponent. Schulz, a former Republican and a wealthy real estate developer, campaigned on the slogan "Energy for the Eighties." Arizona's changing population also hurt Goldwater. The state's population had greatly increased, and a large portion of the electorate had not lived in the state when Goldwater was previously elected, meaning that, unlike with most incumbents, many voters were unfamiliar with Goldwater's actual beliefs. Goldwater spent most of the campaign on the defensive. Although he was eventually declared the winner of the general election by a very narrow margin, receiving 49.5% of the vote to Schulz's 48.4%, early returns on election night indicated that Schulz would win. The counting of votes continued through the night and into the next morning. At around daybreak, Goldwater learned that he had been reelected thanks to absentee ballots, which were among the last to be counted. Goldwater's close victory in 1980 came despite Reagan's 61% landslide over Jimmy Carter in Arizona. Despite Goldwater's struggles, Republicans picked up 12 Senate seats in 1980, regaining control of the chamber for the first time since 1955, when Goldwater was in his first term. Goldwater was now in the most powerful position of his Senate career. In October 1983, Goldwater voted against the legislation establishing Martin Luther King Jr. Day as a federal holiday. On September 21, 1981, Goldwater voted in favor of Reagan's Supreme Court nomination of Sandra Day O'Connor. Goldwater was absent for the votes on the nominations of William Rehnquist as Chief Justice of the United States and Antonin Scalia as Associate Justice on September 17, 1986. After the new Senate convened in January 1981, Goldwater became chairman of the Senate Intelligence Committee. In this role he clashed with the Reagan administration in April 1984, when he discovered that the Central Intelligence Agency (CIA) had been mining the waters of Nicaragua since February, a charge he had initially denied when the matter was first raised. In a note to CIA director William Casey, Goldwater denounced the mining as an "act of war", declaring that "this is no way to run a railroad"; he crossly observed that only Congress had the power to declare war and accused the CIA of illegally mining Nicaraguan waters without the permission of Congress. Goldwater concluded, "The President has asked us to back his foreign policy. Bill, how can we back his foreign policy when we don't know what the hell he is doing? Lebanon, yes, we all knew that he sent troops over there. But mine the harbors in Nicaragua? This is an act violating international law. It is an act of war. 
For the life of me, I don't see how we are going to explain it." Goldwater felt compelled to issue an apology on the floor of the Senate because the Senate Intelligence Committee had failed in its duty to oversee the CIA, saying, "I am forced to apologize for the members of my committee because I did not know the facts on this case. And I apologize to all the members of the Senate for the same reason". Goldwater subsequently voted for a Congressional resolution condemning the mining. In his 1980 Senate reelection campaign, Goldwater had won support from religious conservatives, but in his final term he voted consistently to uphold legal abortion, and in 1981 he gave a speech declaring that he was angry about the bullying of American politicians by religious organizations and would "fight them every step of the way". He introduced the 1984 Cable Franchise Policy and Communications Act, which allowed local governments to require the transmission of public, educational, and government access (PEG) channels, barred cable operators from exercising editorial control over the content of programs carried on PEG channels, and absolved them of liability for their content. On May 12, 1986, Goldwater was presented with the Presidential Medal of Freedom by President Reagan. In response to Moral Majority founder Jerry Falwell's opposition to the nomination of Sandra Day O'Connor to the Supreme Court, of which Falwell had said, "Every good Christian should be concerned", Goldwater retorted, "Every good Christian ought to kick Falwell right in the ass." According to John Dean, Goldwater actually suggested that good Christians ought to kick Falwell in the "nuts", but the news media "changed the anatomical reference". Goldwater also had harsh words for his one-time political protégé, President Reagan, particularly after the Iran–Contra affair became public in 1986. Journalist Robert MacNeil, a friend of Goldwater's from the 1964 presidential campaign, recalled interviewing him in his office shortly afterward. "He was sitting in his office with his hands on his cane... and he said to me, 'Well, aren't you going to ask me about the Iran arms sales?' It had just been announced that the Reagan administration had sold arms to Iran. And I said, 'Well, if I asked you, what would you say?' He said, 'I'd say it's the god-damned stupidest foreign policy blunder this country's ever made!'" Despite the Iran–Contra scandal, Goldwater nonetheless thought that Reagan was a good president. Retirement. Goldwater said later that the close result in 1980 convinced him not to run again. He retired in 1987, having served as chair of the Senate Intelligence and Armed Services Committees in his final term. Despite his reputation as a firebrand in the 1960s, by the end of his career he was considered a stabilizing influence in the Senate, one of the most respected members of either major party. Although Goldwater remained staunchly anti-communist and "hawkish" on military issues, he was a key supporter of the fight for ratification of the Panama Canal Treaty in the 1970s, which would give control of the canal zone to the Republic of Panama. His most important legislative achievement may have been the Goldwater–Nichols Act, which reorganized the U.S. military's senior command structure. Policies. Goldwater became most associated with anti-union work and anti-communism; he was a supporter of the conservative coalition in Congress. 
His work on labor issues led Congress to pass major anti-labor reforms in 1957 and prompted a campaign by the AFL–CIO to defeat his 1958 reelection bid. He voted against the 1954 censure of Senator Joseph McCarthy, who had been making unfounded claims about communists infiltrating the U.S. State Department during the Red Scare, though Goldwater himself never accused any individual of being a communist or Soviet agent. Goldwater emphasized his strong opposition to the worldwide spread of communism in his 1960 book "The Conscience of a Conservative". The book became an important reference text in conservative political circles. In 1964, Goldwater ran a conservative campaign that emphasized states' rights. Goldwater's 1964 campaign was a magnet for conservatives, since he opposed interference by the federal government in state affairs. Goldwater voted in favor of the Civil Rights Act of 1957 and the 24th Amendment to the U.S. Constitution, but did not vote on the Civil Rights Act of 1960 because he was absent from the chamber, with Senate Minority Whip Thomas Kuchel (R–CA) announcing that Goldwater would have voted in favor if present. Though he had supported the original Senate version of the bill, Goldwater voted against the Civil Rights Act of 1964. His public stance was based on his view that Title II and Title VII of the Act interfered with the rights of private persons to do or not to do business with whomever they chose, and on his belief that the private employment provisions of the Act would lead to racial quotas. In the segregated city of Phoenix in the 1950s, he had quietly supported civil rights for blacks, but would not let his name be used. All this appealed to white Southern Democrats, and Goldwater was the first Republican to win the electoral votes of all of the Deep South states (South Carolina, Georgia, Alabama, Mississippi, and Louisiana) since Reconstruction. However, Goldwater's vote on the Civil Rights Act proved devastating to his campaign everywhere outside the South (besides Dixie, Goldwater won only in Arizona, his home state), contributing to his landslide defeat in 1964. Goldwater's campaign also included stringently fiscally conservative policies. Goldwater was strongly critical of Johnson's War on Poverty policies and argued that it might be the "attitude or the actions" of the poor that were responsible for their hardship. In his prepared speech before the Economic Club of New York, Goldwater also claimed that arguing that unemployment and poverty are caused by lack of education is "like saying that people have big feet because they wear big shoes. The fact is that most people who have no skill have no education for the same reason—low intelligence or low ambition." Goldwater also called for ending agricultural subsidies, privatizing Social Security, and privatizing the Tennessee Valley Authority. While Goldwater had been depicted by his opponents in the Republican primaries as the representative of a conservative philosophy that was extreme and alien, his voting record shows that his positions were generally aligned with those of other Republicans in Congress. Goldwater fought in 1971 to stop U.S. funding of the United Nations after the People's Republic of China was admitted to the organization. Goldwater and revival of American conservatism. Although Goldwater was not as important in the American conservative movement as Ronald Reagan after 1965, he shaped and redefined the movement from the late 1950s to 1964. 
Arizona Senator John McCain, who succeeded Goldwater in the Senate in 1987, said of Goldwater's legacy, "He transformed the Republican Party from an Eastern elitist organization to the breeding ground for the election of Ronald Reagan." Columnist George Will remarked that Reagan's victory in the 1980 presidential election was the metaphoric culmination of 16 years of counting the votes for Goldwater from the 1964 presidential race. The Republican Party recovered from the 1964 election debacle, gaining 47 seats in the House of Representatives in the 1966 mid-term election. In January 1969, after Goldwater had been re-elected to the Senate, he wrote an article in the "National Review" "affirming that he [was] not against liberals, that liberals are needed as a counterweight to conservatism, and that he had in mind a fine liberal like Max Lerner." Goldwater was also a strong supporter of environmental protection, a position he laid out in 1965. Later life. By the 1980s, with Ronald Reagan as president and the growing involvement of the religious right in conservative politics, Goldwater's libertarian views on personal issues came to the fore; he believed that they were an integral part of true conservatism. Goldwater viewed abortion as a matter of personal choice and as such supported abortion rights. As a passionate defender of personal liberty, he saw the religious right's views as an encroachment on personal privacy and individual liberties. Although he voted against making Martin Luther King's birthday a national holiday in his last term as senator, Goldwater later expressed support for it. In 1987, he received the Langley Gold Medal from the Smithsonian Institution. In 1988, Princeton University's American Whig-Cliosophic Society awarded Goldwater the James Madison Award for Distinguished Public Service in recognition of his career. After his retirement in 1987, Goldwater described Arizona Governor Evan Mecham as "hardheaded" and called on him to resign, and two years later stated that the Republican Party had been taken over by a "bunch of kooks". During the 1988 presidential campaign, he told vice-presidential nominee Dan Quayle at a campaign event in Arizona, "I want you to go back and tell George Bush to start talking about the issues." Some of Goldwater's statements in the 1990s alienated many social conservatives. He endorsed Democrat Karan English in an Arizona congressional race, urged Republicans to lay off Bill Clinton over the Whitewater scandal, and criticized the military's ban on homosexuals, saying, "Everyone knows that gays have served honorably in the military since at least the time of Julius Caesar", and, "You don't need to be 'straight' to fight and die for your country. You just need to shoot straight." A few years before his death, he addressed establishment Republicans by saying, "Do not associate my name with anything you do. You are extremists, and you've hurt the Republican party much more than the Democrats have." In a 1994 interview with "The Washington Post", and again that November, Goldwater repeated his concerns about religious groups attempting to gain control of the Republican Party. In 1996, he told Bob Dole, whose own presidential campaign received lukewarm support from conservative Republicans, "We're the new liberals of the Republican party. Can you imagine that?" In that same year, with Senator Dennis DeConcini, Goldwater endorsed an Arizona initiative to legalize medical marijuana, over the objections of social conservatives. Personal life. 
In 1934, Goldwater married Margaret "Peggy" Johnson, daughter of a prominent industrialist from Muncie, Indiana. The couple had four children: Joanne (born January 18, 1936), Barry (born July 15, 1938), Michael (born March 15, 1940), and Peggy (born July 27, 1944). Goldwater became a widower in 1985 and, in 1992, married Susan Wechsler, a nurse 32 years his junior. Goldwater's son Barry Goldwater Jr. served as a Republican congressman, representing California from 1969 to 1983. Goldwater's grandson, Ty Ross, is an interior designer and former Zoli model. Ross, who is openly gay and HIV positive, has been credited with inspiring the elder Goldwater "to become an octogenarian proponent of gay civil rights". Goldwater ran track and cross country in high school, where he specialized in the 880-yard run. In 1940, he became one of the first people to run the Colorado River recreationally through the Grand Canyon, participating as an oarsman on Norman Nevills' second commercial river trip. Goldwater joined the party at Green River, Utah, and rowed his own boat down to Lake Mead. In 1970, the Arizona Historical Foundation published the daily journal Goldwater had maintained on the Grand Canyon journey, including his photographs, in a 209-page volume titled "Delightful Journey". In 1963, he joined the Arizona Society of the Sons of the American Revolution. He was also a lifetime member of the Veterans of Foreign Wars, the American Legion, and Sigma Chi fraternity. He belonged to both the York Rite and Scottish Rite of Freemasonry and was awarded the 33rd degree in the Scottish Rite. Hobbies and interests. Amateur radio. Goldwater was an avid amateur radio operator from the early 1920s, with the call signs 6BPI, K3UIG, and K7UGA. The last is now used as a commemorative call sign by an Arizona club honoring him. During the Vietnam War he was a Military Affiliate Radio System (MARS) operator. Goldwater was a spokesman for amateur radio and its enthusiasts. Beginning in 1969 and for the rest of his life, he appeared in many educational and promotional films (and later videos) about the hobby, produced for the American Radio Relay League (the national society representing the interests of radio amateurs) by such producers as Dave Bell (W6AQ), ARRL Southwest Director John R. Griggs (W6KW), Alan Kaul (W6RCL), Forrest Oden (N6ENV), and Roy Neal (K6DUE). His first appearance was in Dave Bell's "The World of Amateur Radio", where Goldwater discussed the history of the hobby and demonstrated a live contact with Antarctica. His last on-screen appearance dealing with "ham radio" was in 1994, explaining an upcoming Earth-orbiting ham radio relay satellite. Electronics was a hobby for Goldwater beyond amateur radio. He enjoyed assembling Heathkits, completing more than 100 of them and often visiting their maker in Benton Harbor, Michigan, to buy more, before the company exited the kit business in 1992. Kachina dolls. In 1916, Goldwater visited the Hopi reservation with Phoenix architect John Rinker Kibby and obtained his first kachina doll. His collection eventually numbered 437 dolls and was presented in 1969 to the Heard Museum in Phoenix. Photography. Goldwater was an amateur photographer and, in his estate, left some 15,000 of his images to three Arizona institutions. He was keen on candid photography. He became interested in the hobby after receiving a camera as a gift from his wife on their first Christmas. He used a 4×5 Graflex, a Rolleiflex, a 16 mm Bell and Howell motion picture camera, and a 35 mm Nikkormat FT. 
He was a member of the Royal Photographic Society from 1941, becoming a Life Member in 1948. For decades, he contributed photographs of his home state to "Arizona Highways" and was recognized for his Western landscapes and pictures of Native Americans. Three books feature his photographs: "People and Places" (1967); "Barry Goldwater and the Southwest" (1976); and "Delightful Journey" (written in 1940, published in 1970). Ansel Adams wrote a foreword to the 1976 book. Goldwater's photography interest occasionally crossed into his political career. John F. Kennedy, as president, would sometimes invite former congressional colleagues to the White House for a drink. On one occasion, Goldwater brought his camera and photographed President Kennedy. When Kennedy received the photo, he returned it to Goldwater with the inscription: "For Barry Goldwater—Whom I urge to follow the career for which he has shown such talent—photography!—from his friend—John Kennedy." This quip became a classic of American political humor after it was relayed by humorist Bennett Cerf. The photo was prized by Goldwater for the rest of his life and sold for $17,925 in a 2010 Heritage auction. His son Michael Prescott Goldwater formed the Goldwater Family Foundation with the goal of making his father's photography available via the internet; the resulting website, "Barry Goldwater Photographs", was launched in September 2006 to coincide with the HBO documentary "Mr. Conservative", produced by granddaughter CC Goldwater. UFOs. On March 28, 1975, Goldwater wrote to Shlomo Arnon: "The subject of UFOs has interested me for some long time. About ten or twelve years ago I made an effort to find out what was in the building at Wright-Patterson Air Force Base where the information has been stored that has been collected by the Air Force, and I was understandably denied this request. It is still classified above Top Secret." Goldwater further wrote that there were rumors the evidence would be released, and that he was "just as anxious to see this material as you are, and I hope we will not have to wait much longer". The April 25, 1988, issue of "The New Yorker" carried an interview with Goldwater in which he recounted his efforts to gain access to the room, and he repeated the story in a 1994 "Larry King Live" interview. Death. Goldwater's public appearances ended in late 1996 after he suffered a massive stroke. Family members disclosed that he was also in the early stages of Alzheimer's disease. He died on May 29, 1998, at the age of 89, at his long-time home in Paradise Valley, Arizona, of complications from the stroke. His funeral was officiated by both a Christian minister and a rabbi. His ashes were buried at the Episcopal Christ Church of the Ascension in Paradise Valley. A memorial statue was erected in a small park in Paradise Valley, near his home and final resting place. Legacy. Buildings and monuments. Among the buildings and monuments named after Barry Goldwater are the Barry M. Goldwater Terminal at Phoenix Sky Harbor International Airport, Goldwater Memorial Park in Paradise Valley, Arizona, the Barry Goldwater Air Force Academy Visitor Center at the United States Air Force Academy, and Barry Goldwater High School in northern Phoenix. In 2010, former Arizona Attorney General Grant Woods, himself a Goldwater scholar and supporter, founded the Goldwater Women's Tennis Classic Tournament, held annually at the Phoenix Country Club in Phoenix. On February 11, 2015, a statue of Goldwater by Deborah Copenhaver Fellows was unveiled by U.S. 
House and Senate leaders at a dedication ceremony in National Statuary Hall of the U.S. Capitol building in Washington, D.C. Barry Goldwater Peak is the highest peak in the White Tank Mountains. Barry M. Goldwater Scholarship. The Barry M. Goldwater Scholarship and Excellence in Education Program was established by Congress in 1986. Its goal is to provide a continuing source of highly qualified scientists, mathematicians, and engineers by awarding scholarships to college students who intend to pursue careers in these fields. The scholarship is widely considered the most prestigious award in the U.S. conferred upon undergraduates studying the sciences. It is awarded to about 400 college sophomores and juniors nationwide, in the amount of $7,500 per academic year (covering their senior year, or their junior and senior years). It honors Goldwater's keen interest in science and technology. Documentary. Goldwater's granddaughter, CC Goldwater, co-produced a documentary on Goldwater's life with longtime friend and independent film producer Tani L. Cohen, "Mr. Conservative: Goldwater on Goldwater", first shown on HBO on September 18, 2006. In popular culture. In his song "I Shall Be Free No. 10", Bob Dylan refers to Goldwater: "I'm liberal to a degree, I want everybody to be free. But if you think I'll let Barry Goldwater move in next door and marry my daughter, you must think I'm crazy." In the 1965 film "The Bedford Incident", the actor Richard Widmark, playing the film's antagonist, Captain Eric Finlander of the fictional destroyer USS "Bedford", modeled his character's mannerisms and rhetorical style on Goldwater. Relatives. Goldwater's son Barry Goldwater Jr. served as a congressman from California from 1969 to 1983; he was the first congressman to serve while his father sat in the Senate. Goldwater's uncle Morris Goldwater served in the Arizona territorial and state legislatures and as mayor of Prescott, Arizona. Goldwater's nephew Don Goldwater sought the Republican nomination for governor of Arizona in 2006, but was defeated by Len Munsil.
4794
7903804
https://en.wikipedia.org/wiki?curid=4794
Baralong incidents
The Baralong incidents were two incidents during the First World War, in August and September 1915, involving the Royal Navy Q-ship HMS "Baralong" and two German U-boats. "Baralong" sank "U-27", which had been attacking a nearby merchant ship, the "Nicosian". About a dozen of the submarine's crewmen managed to escape from the sinking vessel, and Lieutenant Godfrey Herbert, commanding officer of "Baralong", ordered the survivors to be fired on. All the survivors of "U-27"s sinking, including several who had reached the "Nicosian", were shot by "Baralong"s crew and attached marines. Later, "Baralong", under the command of Andrew Wilmot-Smith, sank "U-41" in an incident which has also been described as a British war crime. First incident. Action of 19 August 1915. After the sinking of RMS "Lusitania" by a German submarine in May 1915, Lieutenant-Commander Godfrey Herbert, commanding officer of "Baralong", was visited by two officers of the Admiralty's Secret Service branch at the naval base at Queenstown, Ireland. He was told, "This "Lusitania" business is shocking. Unofficially, we are telling you... take no prisoners from U-boats." Interviews with his subordinate officers have established Herbert's undisciplined manner of commanding his ship. Herbert allowed his men to engage in drunken binges during shore leave. During one such incident, at Dartmouth, several members of "Baralong"s crew were arrested after destroying a local pub. Herbert paid their bail, then left port with the bailed crewmen aboard. Beginning in April 1915, Herbert ordered his subordinates to cease calling him "Sir" and to address him only by the pseudonym "Captain William McBride". Throughout the summer of 1915, "Baralong" continued routine patrol duties in the Irish Sea without encountering the enemy. On 19 August 1915, "U-24" sank the White Star liner "Arabic" with the loss of 44 lives; the dead included three Americans, and the sinking caused a diplomatic incident between Germany and the United States. "Baralong"s crew was infuriated by the attack. Meanwhile, south of Queenstown, "U-27", commanded by "Kapitänleutnant" Bernd Wegener, stopped the British steamer "Nicosian" with a warning shot in accordance with cruiser rules. The ship was carrying 354 American mules earmarked for the British Army in France. The Germans allowed the freighter's crew and passengers to board lifeboats, and prepared to sink the freighter with the U-boat's deck gun. However, "Nicosian" had sent out SOS messages about her plight. Gibson and Prendergast claim that these messages were still being sent from the ship when "Baralong" arrived, implying that at least some crew were still on board when "U-27" commenced shelling. Halpern equivocates on the issue: they may or may not all have abandoned ship by that time. The messages also told "Baralong", inaccurately, that a second submarine was present. "U-27" was lying off "Nicosian"s port quarter and firing into it when "Baralong" appeared on the scene, flying the ensign of the United States as a false flag. When she was half a mile away, "Baralong" ran up a signal flag indicating that she was going to rescue "Nicosian"s crew. Wegener acknowledged the signal, then ordered his men to cease firing, and took "U-27" along the port side of "Nicosian" to intercept "Baralong". As the submarine disappeared behind the steamship, Herbert steered "Baralong" on a parallel course along "Nicosian"s starboard side. Before "U-27" came round "Nicosian"s bow, "Baralong" hauled down the American flag, hoisted the Royal Navy's White Ensign, and unmasked her guns. 
As "U-27" came into view from behind "Nicosian", "Baralong" began shooting with its three 12-pounder guns at a range of , firing 34 rounds. "U-27" rolled over and began to sink. According to Tony Bridgland; Herbert screamed, "Cease fire!" But his men's blood was up. They were avenging the "Arabic" and the "Lusitania". For them this was no time to cease firing, even as the survivors of the crew appeared on the outer casing, struggling out of their clothes to swim away from her. There was a mighty hiss of compressed air from her tanks and the "U-27" vanished from sight in a vortex of giant rumbling bubbles, leaving a pall of smoke over the spot where she had been. It had taken only a few minutes to fire the thirty-four shells into her. Meanwhile, "Nicosian"s crew were cheering from the lifeboats. Captain Manning was heard to yell, "If any of those bastard Huns come up, lads, hit 'em with an oar!" Twelve men survived the sinking of the submarine: the crews of her two deck guns and those who had been on the conning tower. They swam to "Nicosian" and attempted to climb up its hanging lifeboat falls and pilot ladder. Herbert claimed in his report to the Admiralty to have been worried that the German survivors might try to scuttle the steamer as an explanation for why he ordered his men to open fire with small arms. Between four and six managed to get on board. Wegener is described by some accounts as being shot while trying to swim to the "Baralong", in other accounts he boarded the ship. Herbert was told by the "Nicosian"s master that the crew of the vessel kept weapons aboard in the chart room. He then sent "Baralong"s 12 Royal Marines, commanded by a Corporal Collins, to find the surviving German sailors aboard "Nicosian". As they departed, Herbert ordered Collins, "Take no prisoners." The Germans were discovered in the engine room and shot on sight. According to Sub-Lieutenant Gordon Charles Steele: "Wegener ran to a cabin on the upper deck – I later found out it was Manning's bathroom. The marines broke down the door with the butts of their rifles, but Wegener squeezed through a scuttle and dropped into the sea. He still had his life-jacket on and put up his arms in surrender. Corporal Collins, however, took aim and shot him through the head." Corporal Collins later recalled that, after Wegener's death, Herbert threw a revolver in the dead German captain's face and screamed, "What about the "Lusitania", you bastard!" An alternative allegation by the Admiralty is that the Germans who boarded "Nicosian" were killed by the freighter's engine room staff; this report apparently came from the officer commanding the muleteers. Aftermath. In Herbert's report to the Admiralty, he stated he feared the survivors from the U-boat's crew would board the freighter and scuttle it, so he ordered the Royal Marines on his ship to shoot the survivors. If they had scuttled the freighter, it could have been considered as negligence on the part of Herbert. Moments before "Baralong" began its attack, the submarine was firing on the freighter. It is not known if the escaping sailors actually intended to scuttle the freighter. The Admiralty, upon receiving Herbert's report, immediately ordered its suppression, but the strict censorship imposed on the event failed when Americans who had witnessed the incident from "Nicosian"s lifeboats spoke to newspaper reporters after their return to the United States. German memorandum. 
The German government delivered a memorandum on the incident via the American ambassador in Berlin, who received it on 6 December 1915. In it, they cited six US citizens as witnesses, stating that the six had made sworn depositions regarding the incident before notaries public in the USA. The statements said that five survivors from "U-27" managed to board "Nicosian", while the rest were shot and killed on Herbert's orders while clinging to the merchant vessel's lifeboat falls. It was further stated that when Herbert ordered his Marines to board "Nicosian", he gave the order "take no prisoners". Four German sailors were found in "Nicosian"s engine room and propeller shaft tunnel, and were killed. According to the witness statements, "U-27"s commander was shot while swimming towards "Baralong". The memorandum demanded that the captain and crew of "Baralong" be tried for the murder of unarmed German sailors, threatening to "take the serious decision of retribution for an unpunished crime". Sir Edward Grey replied through the American ambassador that the incident could be grouped together with three "German crimes" that also took place within 48 hours: the Germans' sinking of SS "Arabic", their attack on a stranded British submarine on the neutral Danish coast, and their attack on the crew of the steamship "Ruel" after they had abandoned ship. He suggested that these be placed before a tribunal composed of US Navy officers, a proposal the German authorities rejected. German reaction. A debate took place in the "Reichstag" on 15 January 1916, where the incident was described as a "cowardly murder" and Grey's note as being "full of insolence and arrogance". It was announced that reprisals had been decided upon, but not what they would be. Meanwhile, the Military Bureau for the Investigation of Violations of the Laws of War added "Baralong"s commanding officer, known to the Germans only as "Captain William McBride", to the Prussian Ministry of War's "Black List of Englishmen who are Guilty of Violations of the Laws of War vis-à-vis Members of the German Armed Forces". A German medal was issued commemorating the event. As a precaution against reprisals on their crews, in October 1915 HMS "Baralong" was renamed HMS "Wyandra" and transferred to the Mediterranean, and "Baralong"s name was deleted from Lloyd's Register. In 1916 "Wyandra" returned to the Ellerman & Bucknall Line under the name "Manica". "Nicosian" was renamed "Nevisian", and the crew was issued new Discharge Books with the voyage omitted. "Baralong"s crew were later awarded a £185 prize bounty for sinking "U-27". Second incident. Action of 24 September 1915. On 24 September 1915, "Baralong" sank the U-boat "U-41", for which her commanding officer at the time, Lieutenant-Commander Andrew Wilmot-Smith, was awarded the DSO, the engineer J. M. Dowie the DSC, and two of the crew the DSM. A bounty of £1,000 was also awarded, and Wilmot-Smith was later granted a £170 prize bounty by the Prize Court. "U-41" was in the process of sinking SS "Urbino" with gunfire when "Baralong", which had set out from Falmouth the day before, arrived on the scene flying an American flag. "Baralong" followed "U-41"s instructions while manoeuvring to within 700 yards and to an angle from which her guns could fire. "Baralong" then opened fire with her starboard and rear guns, with marines adding rifle fire. The conning tower was struck, killing the captain and six crew, and other shots struck the hull. "U-41" began to list, then dived. 
It abruptly resurfaced, and only two of the crew (a wounded "Leutnant" and the helmsman) escaped from a hatch before it sank again. The two survivors and the crew of "Urbino" were picked up by "Baralong" before it returned to Falmouth the following morning. When "U-41" surfaced near "Baralong", the latter allegedly opened fire while continuing to fly the American flag, and sank the U-boat. Aftermath of the second incident. Unlike the first incident, which had been witnessed by neutral Americans, the only witnesses to the second attack were the German and British sailors present. "Oberleutnant zur See" Iwan Crompton, after returning to Germany from a prisoner-of-war camp, reported that "Baralong" had run down the lifeboat he was in; he leapt clear and was soon afterward taken aboard "Baralong". The British crew denied that they had run down the lifeboat. In 1917, Crompton published an account of "U-41"s exploits, "U-41: der zweite Baralong-Fall", which termed the sinking of "U-41" a "second Baralong case". The event was also commemorated by a propaganda medal designed by the German engraver Karl Goetz, one of many such medals that were popular in Germany from about 1910 to 1940.
4796
6320553
https://en.wikipedia.org/wiki?curid=4796
Bladder (disambiguation)
The bladder (or urinary bladder) is an organ that collects urine for excretion in animals. Bladder may also refer to:
4797
16420254
https://en.wikipedia.org/wiki?curid=4797
Bob Young (businessman)
Robert Young (born 1953/1954) is a businessman best known for founding Red Hat Inc., the open source software company. He owns the franchises for Forge FC of the Canadian Premier League and the Hamilton Tiger-Cats of the Canadian Football League, styling himself the Tiger-Cats' "caretaker". Early life. He was born in Hamilton, Ontario, Canada. He attended Trinity College School in Port Hope, Ontario. He received a Bachelor of Arts from Victoria College at the University of Toronto. Career. Prior to Red Hat, Young built a couple of computer rental and leasing businesses, including Vernon Computer Rentals, which he founded in 1984; successor companies still operate under that name. After leaving Vernon, Young founded ACC Corp Inc. in 1993. Marc Ewing and Young co-founded the open-source software company Red Hat, which was a member of the S&P 500 Index before being purchased by IBM on July 9, 2019. Their partnership started in 1994, when ACC acquired the Red Hat trademarks from Ewing. In early 1995, ACC changed its name to Red Hat Software, which was subsequently shortened to simply Red Hat, Inc. Young was Red Hat's CEO until 1999. In 2002, Young founded Lulu.com, a print-on-demand self-publishing company, and served as its CEO. In 2006, he established the Lulu Blooker Prize, a book prize for books that began as blogs, launching it partly as a means to promote Lulu. Young was CEO of PrecisionHawk, a commercial drone technology company, from 2015 to 2017. Prior to being named PrecisionHawk's CEO in 2015, he was an early investor in the company, and he continues on its board as chairman. Young also co-founded "Linux Journal" in 1994, and in 2003 he purchased the Hamilton Tiger-Cats of the Canadian Football League. In 2022, he sold minority stakes in the Tiger-Cats to Jim Lawson, team president Scott Mitchell, and the steel manufacturer Stelco. Young focuses his philanthropic efforts on access to information and the advancement of knowledge. In 1999, he co-founded the Center for the Public Domain. Young has supported Creative Commons, Public Knowledge.org, the Dictionary of Old English, the Loran Scholarship Foundation, ibiblio.org, and the NCSU eGames, among others.
4800
12416903
https://en.wikipedia.org/wiki?curid=4800
Babylon 5
Babylon 5 is an American space opera television series created by writer and producer J. Michael Straczynski, under the Babylonian Productions label, in association with Straczynski's Synthetic Worlds Ltd. and Warner Bros. Domestic Television. After the successful airing of a test pilot movie, "The Gathering", on February 22, 1993, Warner Bros. commissioned the series for production in May 1993 as part of its Prime Time Entertainment Network (PTEN). The show premiered in the United States on January 26, 1994, and ran for five 22-episode seasons. The series follows the human military staff and alien diplomats stationed on a space station, "Babylon 5", built in the aftermath of several major inter-species wars as a neutral ground for galactic diplomacy and trade. Major plotlines included intra-race intrigue and upheaval, inter-race wars and their aftermaths, and embroilment in a millennial cyclic conflict between ancient races. The human characters, in particular, become pivotal to the resistance against Earth's descent into totalitarianism. Many episodes focused on the effect of wider events on individual characters, exploring themes such as personal change, loss, oppression, corruption, and redemption. Unusually for American broadcast television at the time of its airing, "Babylon 5" was conceived as a "novel for television" with a pre-planned five-year story arc, each episode envisioned as a "chapter". Whereas contemporaneous television shows tended to maintain the overall status quo, confining conflicts to individual episodes, "Babylon 5" featured story arcs which spanned multiple episodes and even seasons, effecting permanent changes to the series universe. Tie-in novels, comic books, and short stories were also developed to play a significant canonical part in the overall story. Straczynski announced plans for a reboot of the series in September 2021 in conjunction with Warner Bros. Television. An animated feature-length, direct-to-video film, "The Road Home", was released in August 2023. Setting. The main "Babylon 5" story arc occurs between the years 2257 and 2262. The show depicts a future in which Earth has a unified government and has gained the technology for faster-than-light travel using "jump gates", a kind of wormhole technology allowing transport through the alternate dimension of hyperspace. Colonies within the Solar System and beyond make up the Earth Alliance, which has established contact with other spacefaring species. Ten years before the series is set, Earth barely escaped destruction by the technologically superior Minbari, who sought revenge after an Earth starship unwittingly killed their leader during first contact, only for the Minbari to surrender unexpectedly on the brink of victory. Earth has since established peaceful relations with them, and the Earth Alliance has become a significant and generally respected power within the galactic community. Among the other species are the imperialist Centauri; the Narn, who only recently gained independence from the Centauri empire; and the mysterious, powerful Vorlons. Several dozen less powerful species from the League of Non-Aligned Worlds also have diplomatic contact with the major races, including the Drazi, Brakiri, Vree, Markab, and pak'ma'ra. An ancient and secretive race, the Shadows, unknown to humans but documented in many other races' religious texts, malevolently influence events to bring chaos and war among the known species. 
Among the chaos the Shadows cause is a Centauri descent into irredentism and Earth sliding into totalitarianism under President Morgan Clark. The "Babylon 5" space station is located in the Epsilon Eridani system, at the fifth Lagrangian point of the fictional planet Epsilon III and its moon. It is an O'Neill cylinder five miles (about 8 km) long. The station is the last of its line; the first three stations were all destroyed during construction, while "Babylon 4" was completed but mysteriously vanished shortly after being made operational. It contains living areas which accommodate various alien species, providing differing atmospheres and gravities. Human visitors to the alien sectors are shown using breathing equipment and other measures to tolerate the conditions. Cast. Regular cast. "Babylon 5" featured an ensemble cast which changed over the course of the show's run. Recurring guests. In addition, several other actors filled more than one minor role on the series. Kim Strauss played the Drazi Ambassador in four episodes, as well as nine other characters in ten more episodes. Some actors had difficulty dealing with the application of prosthetics required to play some of the alien characters. The producers therefore used the same group of people (as many as 12) in various mid-level speaking roles, taking full head and body casts from each. The group came to be unofficially known by the production as the "Babylon 5 Alien Rep Group." Synopsis. The five seasons of the series each correspond to one fictional sequential year in the period 2258–2262. Each season shares its title with an episode that is central to that season's plot. Pilot film (1993). In the pilot film, "The Gathering", the Vorlon ambassador Kosh is nearly killed by an assassin shortly after arriving at the station. Jeffrey Sinclair, the commander of "Babylon 5", is named as the prime suspect, but is proven to have been framed. Season 1: Signs and Portents (1994). Commander Sinclair, a hero of the Minbari war, is troubled by his inability to remember events of the war's last day. Though supported by Minbari ambassador Delenn, who is secretly a member of the Minbari ruling Grey Council, other Minbari remain distrustful of him. The Narn ambassador G'Kar continually presses for concessions from their former overlords, the Centauri Republic. Centauri ambassador Londo Mollari finds a new ally in the enigmatic Mr. Morden to strike back at the Narn. Meanwhile, xenophobic groups on Earth challenge humanity's tolerance of aliens. This tension culminates in the assassination of Earth's President Santiago, who favored such contact. Season 2: The Coming of Shadows (1994–1995). Sinclair is transferred to be ambassador to Minbar, and General Hague assigns Captain John Sheridan command of the station. Hague and Sheridan believe now-President Clark conspired in Santiago's death but have no proof. Clark gradually moves Earth in an isolationist direction and takes steps to install a totalitarian government. When the aging Centauri Emperor Turhan dies, Mollari and his ally Lord Refa install Turhan's unstable nephew Cartagia as emperor and force a war against the Narn. Aided by Mr. Morden's "associates", the Shadows, the Centauri decimate the Narn. The war ends with a planetary bombardment of the Narn homeworld, followed by the enslavement of the surviving Narns. Delenn and Vorlon ambassador Kosh request Sheridan's help to fight against their ancient foe, the Shadows. Season 3: Point of No Return (1995–1996).
Sheridan and Delenn establish a "conspiracy of light" to fight the influence of the Shadows. When Clark declares martial law, Sheridan declares "Babylon 5" independent of the Earth government. Mollari realizes his deal with Mr. Morden has become dangerous but is unable to end it. As the Shadows cause conflict and chaos throughout the galaxy, Sheridan confronts Kosh and successfully convinces the Vorlons to provide military assistance against them. In retaliation for Vorlon intervention, the Shadows assassinate Kosh. Sinclair travels back in time a thousand years to aid the Minbari in the previous Shadow War, becoming the legendary Minbari religious leader Valen. Sheridan discovers vulnerabilities in the Shadow vessels and learns to predict their objectives, leading to the first major military defeat of the Shadows. Despite Kosh's warnings, Sheridan confronts the Shadows on their homeworld Z'ha'dum. He crashes a spacecraft packed with nuclear weapons into the planet, seemingly dying in the explosion. Season 4: No Surrender, No Retreat (1996–1997). Sheridan is rescued from Z'ha'dum by the mysterious Lorien. With the Shadows in retreat, the Vorlons begin destroying any planet allied with or influenced by the Shadows. Mollari overthrows the mad emperor Cartagia with the aid of G'Kar, in exchange for the liberation of the Narn from Centauri rule. Mollari betrays the Shadows in order to save the Centauri homeworld from the Vorlons. Sheridan realizes the Vorlons and Shadows have used the younger races in a proxy war, and convinces both sides to permanently end their conflict and leave the younger races alone in peace. Sheridan next refocuses on returning democracy to Earth. He forges a new Interstellar Alliance along with the Minbari, Centauri, and Narn governments. With their help, Sheridan is able to win the Earth civil war and force President Clark out of office. Sheridan is forced to resign from the Earth military, but is named president of the Interstellar Alliance. Season 5: The Wheel of Fire (1998). An ex-lover of Sheridan's, Elizabeth Lochley, is assigned to command the station. A group of rogue human telepaths takes sanctuary on the station, seeking Sheridan's aid to escape the control of Psi Corps, the autocratic Earth agency that oversees telepaths. The Interstellar Alliance refuses to grant them a planet of their own, and they are eventually expelled from the station. Meanwhile, the Drakh, former supporters of the Shadows, seek revenge for the Shadows' defeat. They infiltrate the Centauri government and orchestrate attacks against other Alliance members. Mollari attempts to purge the alien manipulation of his government but is too late. After a devastating attack by Alliance forces on Centauri Prime, Mollari is installed as emperor, but under Drakh control. He then withdraws the Centauri from the Interstellar Alliance. Twenty years later, Sheridan has a last reunion with his friends before leaving to join Lorien and the older races "beyond the rim". Spin-offs and television movies. The original show spawned a multimedia franchise of spin-offs consisting of a miniseries, five television movies, twenty-two novels, two tabletop games (an RPG and a wargame), and various other media such as technical books, comics, and trading cards. Production. Origin.
Having worked on a number of television science fiction shows which had regularly gone over budget, Straczynski concluded that a lack of long-term planning was to blame, and set about looking at ways in which a series could be done responsibly. Taking note of the lessons of mainstream television, which brought stories to a centralized location such as a hospital, police station, or law office, he decided that instead of "[going] in search of new worlds, building them anew each week", a fixed space station setting would keep costs at a reasonable level. A fan of sagas such as the "Foundation" series, "Childhood's End", "The Lord of the Rings", "Dune" and the "Lensman" series, Straczynski wondered why no one had done a television series with the same epic sweep, and concurrently with the first idea started developing the concept for a vastly ambitious epic covering massive battles and other universe-changing events. Realizing that both the fixed-locale series and the epic could be done in a single series, he began to sketch the initial outline of what would become "Babylon 5". Straczynski set five goals for "Babylon 5". He said that the show "would have to be good science fiction". It would also have to be good television, "and rarely are SF shows both good SF *and* good TV; there're generally one or the other." It would have to do for science fiction television what "Hill Street Blues" had done for police dramas, by taking an adult approach to the subject. It would have to be reasonably budgeted, and "it would have to look unlike anything ever seen before on TV, presenting individual stories against a much broader canvas." He further stressed that his approach was "to take [science fiction] seriously, to build characters for grown-ups, to incorporate real science but keep the characters at the center of the story." Some of the staples of television science fiction were also out of the question (the show would have "no kids or cute robots"). The idea was not to present a perfect utopian future, but one with greed and homelessness; one where characters grow, develop, live, and die; one where not everything was the same at the end of the day's events. Citing Mark Twain as an influence, Straczynski said he wanted the show to be a mirror to the real world and to covertly teach. Following production on "Captain Power and the Soldiers of the Future", Straczynski approached John Copeland and Doug Netter, who had also been involved with "Captain Power", and showed them the bible and pilot script for his show; both were impressed with his ideas. They were able to secure an order for the pilot from Warner Bros., which was looking at the time to get programming for a planned broadcast network. Warner Bros. remained skeptical about the show even after greenlighting the pilot. According to Straczynski, Warner Bros. had three main concerns: that American attention spans were too short for a series-long narrative to work, that it would be difficult to sell the show into syndication because the syndicated stations would air the episodes out of order, and that no other science-fiction television show outside of "Star Trek" had gone more than three seasons before being canceled. Straczynski was able to show that the syndication fear was unfounded, since syndicated stations told him they aired their shows in episode order so as to track broadcasts for royalties; however, he could not allay the attention-span or premature-cancellation concerns, and instead set out to prove Warner Bros. wrong.
Writing. Straczynski wrote 92 of the 110 episodes of "Babylon 5", including all 44 episodes in the third and fourth seasons, a feat never before accomplished in American television. Other writers to have contributed scripts to the show include Peter David, Neil Gaiman, Kathryn M. Drennan, Lawrence G. DiTillio, D. C. Fontana, and David Gerrold. Harlan Ellison, a creative consultant on the show, received story credits for two episodes. Each writer was informed of the overarching storyline, enabling the show to be produced consistently under budget. The rules of production were strict; scripts were written six episodes in advance, and changes could not be made once production had started. With not all cast members being hired for every episode of a season, the five-year plot length caused some planning difficulties. If a critical scene involving an actor not hired for every episode had to be moved, that actor had to be paid for work on an extra episode. It was sometimes necessary to adjust the plotline to accommodate external influences, an example being the "trap door" that was written for every character: in the event of that actor's unexpected departure from the series, the character could be written out with minimal impact on the storyline. Straczynski stated, "As a writer, doing a long-term story, it'd be dangerous and short-sighted for me to construct the story without trap doors for every single character. ... That was one of the big risks going into a long-term storyline which I considered long in advance;..." This device was eventually used to facilitate the departures of Claudia Christian and Andrea Thompson from the series. Straczynski purposely went light on elements of the five-year narrative during the first season, as he felt the audience would not be ready for the full narrative at that time, but he still managed to drop in some scenes that would be critical to the future narrative. This also made it challenging for the actors to understand their motivations without knowing where their characters were going; Straczynski said "I didn't want to tell them too much, because that risks having them play the result, rather than the process." He recalled that Peter Jurasik had asked him about the context of Londo's premonition, shown partially in "Midnight on the Firing Line", of himself and G'Kar choking each other to death, but Straczynski had to be coy about it. The full death scene was shown in context in "War Without End - Part 2" near the end of the third season. During production of the fourth season, the Prime Time Entertainment Network, which Warner Bros. had opted to use for "Babylon 5", was shut down, leaving the planned fifth season in doubt. Unwilling to short-change fans of the show, Straczynski began preparing modifications to the fourth season that would allow him to conclude his overall arc should a fifth season not be greenlit, which ultimately became the direction the fourth season took. Straczynski identified three primary narrative threads which would require resolution: the Shadow war, Earth's slide into a dictatorship, and a series of sub-threads which branched off from those. He estimated they would still take around 27 episodes to resolve without the season feeling rushed; the solution came when the TNT network commissioned two "Babylon 5" television films.
Several hours of material were thus moved into the films, including a three-episode arc which would deal with the background to the Earth–Minbari War, and a sub-thread which would have set up the sequel series, "Crusade". Further standalone episodes and plot-threads were dropped from season four, which could be inserted into "Crusade", or the fifth season, were it to be given the greenlight. The intended series finale, "Sleeping in Light", was filmed during season four as a precaution against cancellation. When word came that TNT had picked up "Babylon 5", this was moved to the end of season five and replaced with a newly filmed season four finale, "The Deconstruction of Falling Stars". Costume. Ann Bruice Aling was costume designer for the show, after production designer John Iacovelli suggested her for the position, having previously worked with her on a number of film and theatrical productions. With the variety of costumes required, she compared "Babylon 5" to "eclectic theatre", with fewer rules about period, line, shape and textures having to be adhered to. Preferring natural materials whenever possible, such as ostrich leather in the Narn body armor, Bruice combined and layered fabrics as diverse as rayon and silk with brocades from the 1930s and '40s to give the clothing the appearance of having evolved within different cultures. With an interest in costume history, she initially worked closely with Straczynski to get a sense of the historical perspective of the major alien races, "so I knew if they were a peaceful people or a warring people, cold climate etc. and then I would interpret what kind of sensibility that called for." Collaborating with other departments to establish co-ordinated visual themes for each race, she developed a broad palette of colors with Iacovelli, which he referred to as "spicy brights". These warm shades of gray and secondary colors, such as certain blues for the Minbari, would often be included when designing both the costumes and relevant sets. As the main characters evolved, Bruice referred back to Straczynski and producer John Copeland, whom she viewed as "surprisingly more accessible to me as advisors than other producers and directors", so the costumes could reflect these changes. Ambassador Londo Mollari's purple coat became dark blue and more tailored, while his waistcoats became less patterned and brightly colored, as Bruice felt "Londo has evolved in my mind from a buffoonish character to one who has become more serious and darker." Normally there were three changes of costume for the primary actors: one for use on set, one for the stunt double, and one on standby in case of "coffee spills". For human civilians, garments were generally purchased off-the-rack and altered in various ways, such as removing lapels from jackets and shirts while rearranging closures, to suggest future fashions. For some of the main female characters a more couture approach was taken, as in the suits worn by Talia Winters, which Bruice described as being designed and fitted to within "an inch of their life". Costumes for the destitute residents of downbelow would be distressed through a combination of bleaching, sanding, dipping in dye baths and having stage blood added. Like many of the crew on the show, members of the costume department made onscreen cameos. During the season 4 episode "Atonement", the tailors and costume supervisor appeared as the Minbari women fitting Zack Allan for his new uniform as the recently promoted head of security.
His complaints, and his subsequent stabbing with a needle by costume supervisor Kim Holly, were a light-hearted reference to the previous security uniforms, a design carried over from the pilot movie, which was difficult to work with and wear due to the combination of leather and wool. Prosthetic makeup and animatronics. While the original pilot film featured some aliens which were puppets and animatronics, the decision was made early on in the show's production to portray most alien species as humanoid in appearance. Barring isolated appearances, fully computer-generated aliens were discounted as an idea due to the "massive rendering power" required. Long-term use of puppets and animatronics was also discounted, as Straczynski believed they would not be able to convey "real emotion" without an actor inside. Visuals. In anticipation of the emerging HDTV standard, rather than the usual 4:3 format, the series was shot in widescreen 16:9, with the image cropped to 4:3 for initial television transmissions. It was one of the first television shows to use computer technology in creating visual effects, rather than models and miniatures, primarily out of budgetary concerns; Straczynski estimated that each of their episodes cost considerably less to make than an episode of "Star Trek: Deep Space Nine". The visual effects were achieved using Amiga-based Video Toasters at first, and later Pentium, Macintosh, and Alpha-based systems using LightWave 3D. The effects sequences were designed to simulate Newtonian physics, with particular emphasis on the effects of inertia on the motion of spacecraft. Foundation Imaging provided the special effects for the pilot film (for which it won an Emmy) and the first three seasons of the show, led by Ron Thornton. After co-executive producer Douglas Netter and producer John Copeland approached Straczynski with the idea of producing the effects in-house, Straczynski agreed to replace Foundation for seasons 4 and 5 once a new team had been established by Netter Digital and an equal level of quality was assured, using similar technology and a number of former Foundation employees. The Emmy-winning alien make-up was provided by Optic Nerve Studios. Music and scoring. Christopher Franke composed and scored the musical soundtrack for all five years of the show, after Stewart Copeland, who had worked on the original telefilm, was unable to return for the first season due to recording and touring commitments. Initially concerned that composing for an episodic television show could become "annoying because of the repetition", Franke found the evolving characters and story of "Babylon 5" afforded him the opportunity to continually take new directions. Given creative freedom by the producers, Franke also orchestrated and mixed all the music, which one reviewer described as having "added another dimension of mystery, suspense, and excitement to the show, with an easily distinguishable character that separates 'Babylon 5' from other sci-fi television entries of the era." With his recording studio in the same building as his home in the Hollywood Hills, Franke would attend creative meetings before scoring the 25 minutes or so of music for each episode. Using the "acoustic dirt produced by live instruments and the ability to play so well between two semitones" and the "frequency range, dynamics and control" provided by synthesizers, he described his approach "as experimental friendly as possible without leaving the happy marriage between the orchestral and electronic sounds".
Using Cubase software through an electronic keyboard, or for more complex pieces a light pen and graphics tablet, he would begin by developing the melodic content, around which the ambient components and transitions were added. Using playbacks with digital samples of the appropriate instruments, such as a group of violins, he would decide which tracks to produce electronically or record acoustically. Scores for the acoustic tracks were emailed to his Berlin scoring stage, and would require anywhere from four musicians to the full orchestra, with a maximum of 24 present at any one time. One of three conductors would also be required for any score that involved more than six musicians. Franke would direct recording sessions via six fiber-optic digital telephone lines to transmit and receive video, music and the SMPTE timecode. The final edit and mixing of the tracks would take place in his Los Angeles studio. A total of 24 episode and three television film soundtracks were released under Franke's record label, Sonic Images Records, between 1995 and 2001. These contain the musical scores in the same chronological order as they played in the corresponding episodes or television films. Three compilation albums were also produced, containing extensively re-orchestrated and remixed musical passages taken from throughout the series to create more elaborate suites. In 2007 his soundtrack for "Babylon 5: The Lost Tales" was released under the Varèse Sarabande record label. Broadcast history. Warner Bros. slotted the show to premiere on its nascent Prime Time Entertainment Network (PTEN). As original content from another studio, it was somewhat anomalous in a stable of syndicated content from Warner Bros. and the cause of some friction between Straczynski's company and Warner Bros. The pilot film, "The Gathering", premiered on February 22, 1993, with strong viewing figures, achieving a 9.7 in the Nielsen national syndication rankings. The regular series initially aired from January 26, 1994 through November 25, 1998, first on PTEN, then in first-run syndication, debuting with a 6.8 rating/10 share. Figures dipped in its second week, and while it posted a solid 5.0 rating/8 share, with an increase in several major markets, ratings for the first season continued to fall, to a low of 3.4 during reruns. Ratings remained low-to-middling throughout the first four seasons, but "Babylon 5" scored well with the demographics required to attract the leading national sponsors, and saved up to $300,000 per episode by shooting off the studio lot, thus remaining profitable. The fifth season, which aired on the cable network TNT, had ratings about 1.0 rating point lower than seasons two through four. In the United Kingdom, the show aired every week on Channel 4 without a break, with the result that the last four or five episodes of the early seasons screened in the UK before the US. "Babylon 5" was one of the better-rated US television shows on Channel 4, and achieved high audience Appreciation Indexes, with the season 4 episode "Endgame" achieving the rare feat of beating the prime-time soap operas for first position. Straczynski stated that PTEN only required the show to be profitable for the network to remain in production, and said that while this was the case for its first four seasons, on paper it was always losing money; he also remarked in a 2019 interview that in terms of contractual profit definition the show remained in the red on paper, and stated that he had therefore never made any profits on "Babylon 5".
The entire series cost an estimated $90 million for 110 episodes. "Babylon 5" successfully completed its five-year story arc on November 25, 1998, after five seasons and 109 aired episodes, when TNT aired the 110th (epilogue) episode, "Sleeping in Light", which had been filmed as the Season 4 finale when "Babylon 5" was under threat of cancellation. After a fifth season was assured, a new Season 4 finale was used so that "Sleeping in Light" could remain as the series finale. Remastered version. In November 2020 a remastered version of the show in 4:3 format was released to the iTunes Store and Amazon Prime Video. This version uses the original negatives for filmed elements, and algorithmically upscales the digitally created elements to HD resolution with fewer visual artifacts, for a more visually consistent presentation. In January 2021, it was made available for streaming on HBO Max. In February 2023, HBO's license expired and streaming rights were acquired by the free streaming service Tubi. Themes. Throughout its run, "Babylon 5" found ways to portray themes relevant to modern and historical social issues. It marked several firsts in television science fiction, including the exploration of the political and social landscapes of the first human colonies, their interactions with Earth, and the underlying tensions. "Babylon 5" was also one of the first television science fiction shows to explicitly depict a same-sex relationship. In the show, sexual orientation is as much of an issue as "being left-handed or right-handed". Unrequited love is explored as a source of pain for the characters, though not all the relationships end unhappily. Order vs. chaos; authoritarianism vs. free will. The clash between order and chaos, and the people caught in between, plays an important role in "Babylon 5". The conflict between two unimaginably powerful older races, the Vorlons and the Shadows, is represented as a battle between competing ideologies, each seeking to turn the humans and the other younger races to their beliefs. The Vorlons represent an authoritarian philosophy of unquestioning obedience. Vorlon characters frequently ask, "Who are you?", focusing on identity as a catalyst for shaping personal goals; the intention is not to solicit a correct answer, but to "tear down the artifices we construct around ourselves until we're left facing ourselves, not our roles." The Shadows represent another authoritarian philosophy, cloaked in a disguise of evolution through fire, of fomenting conflict in order to promote evolutionary progress. Characters affiliated with the Shadows repeatedly ask, "What do you want?", emphasizing personal desire and ambition, using it to shape identity, encouraging conflict between groups who choose to serve their own glory or profit. The representation of order and chaos was informed by the Babylonian myth that the universe was born in the conflict between the two. The climax of this conflict comes with the younger races' exposure of the Vorlons' and the Shadows' "true faces" and the rejection of both philosophies, heralding the dawn of a new age without their interference. The notion that the war was about "killing your parents" is echoed in the portrayal of the civil war between the human colonies and Earth. Deliberately dealing in historical and political metaphor, with particular emphasis upon McCarthyism and the HUAC, the Earth Alliance becomes increasingly authoritarian, eventually sliding into a dictatorship.
The show examines the impositions on civil liberties, made under the pretext of greater defense against outside threats, which aid the dictatorship's rise, and the self-delusion of a populace which believes its moral superiority will never allow a dictatorship to come to power, until it is too late. The successful rebellion led by the Babylon 5 station results in the restoration of a democratic government and true autonomy for Mars and the colonies. War and peace. The "Babylon 5" universe portrays numerous armed conflicts on an interstellar scale, including the Dilgar war, the Narn–Centauri conflict, the Minbari civil war, the Drakh War, the Interstellar Alliance–Centauri war, and the Great Burn. The story begins in the aftermath of a war which brought the human race to the brink of extinction, caused by a misunderstanding during a first contact with the Minbari. Babylon 5 is built to foster peace through diplomacy, described as the "last, best hope for peace" in the opening credits monologue during its first three seasons. Wars between separate alien civilizations are featured. The conflict between the Narn and the Centauri is followed from its beginnings as a minor territorial dispute amplified by historical animosity, through to its end, in which weapons of mass destruction are employed to subjugate and enslave a planet. The war is an attempt to portray a more sobering kind of conflict than usually seen on science fiction television. Informed by the events of the first Gulf War, the Cuban Missile Crisis and the Soviet invasion of Czechoslovakia, the intent was to recreate those moments when "the world held its breath"; the emotional core of the conflict was the disbelief that the situation could have occurred at all, and the desperation to find a way to bring it to an end. By the start of the third season, the opening monologue had changed to say that the hope for peace had "failed" and the Babylon 5 station had become the "last, best hope for victory", indicating that while peace is ostensibly a laudable goal, it can also mean a capitulation to an enemy intent on committing horrendous acts, and that "peace is a byproduct of victory against those who do not want peace." The Shadow War also features prominently in the show, wherein the Shadows work to instigate conflict between other races to promote technological and cultural advancement, opposed by the Vorlons, who are attempting to impose their own authoritarian philosophy of obedience. The gradual discovery of the scheme and the rebellion against it underpin the first three seasons, and also serve as a wider metaphor for the competing forces of order and chaos. In that respect, Straczynski stated he presented Earth's descent into a dictatorship as its own "shadow war". In ending the Shadow War before the conclusion of the series, the show was able to more fully explore its aftermath, and it is this "war at home" which forms the bulk of the remaining two seasons. The struggle for independence between Mars and Earth culminates with a civil war between the human colonies (led by the Babylon 5 station) and the home planet. Choosing Mars as both the spark for the civil war and the staging ground for its dramatic conclusion enabled the viewer to understand the conflict more fully than had it involved an anonymous colony orbiting a distant star. The conflict, and the reasons behind it, were informed by Nazism, McCarthyism and the breakup of Yugoslavia; the destruction of that state also served as partial inspiration for the Minbari civil war.
The post-war landscape has its roots in Reconstruction, the attempt to resolve the issues of the American Civil War after the conflict had ended; this struggle for survival in a changed world was also informed by works such as "Alas, Babylon", a novel dealing with the after-effects of a nuclear war on a small American town. The show expresses that the end of these wars is not an end to war itself. Events shown hundreds of years into the show's future tell of wars which will once again bring the human race to the edge of annihilation, demonstrating that humanity will not change, and the best that can be hoped for after it falls is that it climbs a little higher each time, until it can one day "take [its] place among the stars, teaching those who follow." Religion. Many of Earth's contemporary religions are shown to still exist, with the main human characters often having religious convictions. Among those specifically identified are the Roman Catholic branch of Christianity (including the Jesuits), Judaism, and the fictional Foundationism (which developed after first contact with alien races). Alien beliefs in the show range from the Centauri's Bacchanalian-influenced religions, of which there are up to seventy different denominations, to the more pantheistic, as with the Narn and Minbari religions. In the show's third season, a community of Cistercian monks takes up residence on the Babylon 5 station, in order to learn what other races call God and to come to a better understanding of the different religions through study at close quarters. References to both human and alien religion are often subtle and brief, but can also form the main theme of an episode. The first season episode "The Parliament of Dreams" is a conventional "showcase" for religion, in which each species on the Babylon 5 station has an opportunity to demonstrate its beliefs (humanity's are presented as being numerous and varied), while "Passing Through Gethsemane" focuses on a specific aspect of Roman Catholic belief, as well as concepts of justice, vengeance, and biblical forgiveness. Other treatments have been more contentious, such as the David Gerrold-scripted "Believers", in which alien parents would rather see their son die than undergo a life-saving operation, because their religious beliefs forbid it. When religion is an integral part of an episode, various characters express differing viewpoints. The episode "Soul Hunter" touches upon the concept of an immortal soul, and whether after death it is destroyed, reincarnated, or simply ceases to exist. The character arguing the latter, Doctor Stephen Franklin, often appears in the more spiritual storylines, as his scientific rationality is used to create dramatic conflict. Some have seen undercurrents of religions such as Buddhism in various episode scripts, and while identifying himself as an atheist, Straczynski believes that passages of dialog can take on distinct meanings to viewers of differing faiths, and that the show ultimately expresses ideas which cross religious boundaries. Addiction. Substance abuse and its impact on human personalities also features in the "Babylon 5" storyline. Garibaldi is a relapsing-remitting alcoholic who practices complete abstinence throughout most of the series, relapses in the middle of season five, and only recovers at the end of the season. Zack Allan, his eventual replacement as chief of security, was given a second chance by Garibaldi after overcoming his own addiction to an unspecified drug. Dr.
Stephen Franklin develops an addiction to injectable stimulant drugs while trying to cope with the chronic stress and work overload in Medlab, and he takes a leave of absence from his position to recover. Executive Officer Susan Ivanova mentions that her father became an alcoholic after her mother's suicide. Captain Elizabeth Lochley tells Garibaldi that her father was an alcoholic, and that she is a recovering alcoholic herself. Influences. "Babylon 5" draws upon a number of cultural, historical, political and religious influences to inform and illustrate its characters and storylines. Straczynski has stated that there was no intent to wholly represent any particular period of history or preceding work of fiction, but has acknowledged their influence on the series, inasmuch as it uses similar well-established storytelling structures, such as the hero's journey. There are a number of specific literary references. Several episodes take their titles from Shakespearean monologues, and at least one character quotes Shakespeare directly. The Psi Cop Alfred Bester was named after the science fiction author of the same name, as his work influenced the autocratic Psi Corps organization the character represents. There are a number of references to the legend of King Arthur, with ships named "Excalibur" appearing in the main series and the "Crusade" spin-off, and a character in "A Late Delivery from Avalon" claiming to possess the sword itself. Straczynski links the incident which sparked the Earth–Minbari war, in which actions are misinterpreted during a tense situation, to a sequence in "Le Morte d'Arthur", in which a standoff between two armies turns violent when innocent actions are misinterpreted as hostile. The series also references contemporary and ancient history. The Centauri are in part modeled on the Roman empire. Emperor Cartagia believes himself to be a god, a deliberate reference to Caligula. His eventual assassination leads to the ascension of Londo and eventually Vir, both unlikely candidates for the throne, similar to Claudius' improbable ascension after Caligula was assassinated. The series also references the novel "I, Claudius" by Robert Graves when Cartagia jokes that he has cured a man of his cough after having him beheaded, something also done by Caligula. In the episode "In the Shadow of Z'ha'dum", Sheridan ponders Winston Churchill's Coventry dilemma, of whether or not to act on covertly gathered intelligence during a war. Lives would be saved, but at the risk of revealing to the enemy that their intentions are known, which may be far more damaging in the long term. The swearing in of Vice President Morgan Clark invokes the assassination of President John F. Kennedy, being deliberately staged to mirror the scene aboard Air Force One when Lyndon Johnson was sworn in as President. Several episodes have titles referring to Biblical and/or Christian traditional narratives or texts, such as "Passing Through Gethsemane", "A Voice in the Wilderness," and "And the Rock Cried Out, No Hiding Place," the latter being a line from the gospel song "There's No Hiding Place Down Here." Use of the Internet. The show employed Internet marketing to create a buzz among online readers far in advance of the airing of the pilot episode, with Straczynski participating in online communities on USENET (in the rec.arts.sf.tv.babylon5.moderated newsgroup) and on the GEnie and CompuServe systems, before the Web came together as it exists today.
The station's location, in "grid epsilon" at coordinates 470/18/22, was a reference to GEnie ("grid epsilon" = "GE") and the original forum's address on the system's bulletin boards (page 470, category 18, topic 22). Also during this time, Warner Bros. executive Jim Moloshok created and distributed electronic trading cards to help advertise the series. In 1995, Warner Bros. started the official "Babylon 5" website on the now-defunct Pathfinder portal. In September 1995, they hired series fan Troy Rutter to take over the site and move it to its own domain name, and to oversee the "Keyword B5" area on America Online. Reception. In 2004 and 2007, TV Guide ranked "Babylon 5" #13 and #16 on its list of the top cult shows ever. Awards. "Babylon 5" received a number of awards and nominations during its run, including two consecutive Hugo Awards for Best Dramatic Presentation and Emmy Awards for its visual effects and alien make-up. "Star Trek: Deep Space Nine" and Paramount plagiarism controversy. Straczynski indicated that Paramount Television was aware of his concept as early as 1989, when he attempted to sell the show to the studio, and provided them with the series bible, pilot script, artwork, lengthy character background histories, and plot synopses for "22 or so" planned episodes taken from the overall course of the planned series. Paramount declined to produce "Babylon 5", but later announced that "Star Trek: Deep Space Nine" was in development, two months after Warner Bros. announced its plans for "Babylon 5". Unlike previous "Star Trek" shows, "Deep Space Nine" was based on a space station and had themes similar to those of "Babylon 5", which drew comparisons between the two series. Straczynski stated that, even though he was confident that "Deep Space Nine" producer/creators Rick Berman and Michael Piller had not seen the material, he suspected that Paramount executives used his bible and scripts to steer development of "Deep Space Nine". Straczynski and Warner did not file suit against Paramount, largely because he believed it would negatively affect both series. He argued the same when confronted by claims that the lack of legal action was proof that his allegation was unfounded. According to a 2017 interview with Patricia Tallman, there was a legal case and an out-of-court settlement with Paramount. Influence and legacy. Generally viewed as having "launched the new era of television CGI visual effects", "Babylon 5" received multiple awards during its initial run, including two consecutive Hugo Awards for best dramatic presentation, and continues to regularly feature prominently in various polls and listings highlighting top-rated science fiction series. "Babylon 5" has been praised for its depth and complexity against a backdrop of contemporary shows which largely lacked long-term consequences, with plots typically being resolved in the course of a single episode, occasionally two. Straczynski was deeply involved in scriptwriting, writing 92 of 110 teleplays, a greater proportion than some of his contemporaries. Reviewers rated the quality of writing on a widely varying scale, identifying both eloquent soliloquies and dialogue that felt "stilted and theatrical." Straczynski has claimed that the multi-year story arc, now a feature of most mainstream televised drama, is the lasting legacy of the series. He stated that both Ronald D. Moore and Damon Lindelof used the five-year narrative structure of "Babylon 5" as a blueprint for their respective shows, "Battlestar Galactica" and "Lost". He also claims "Babylon 5" was the first series to be shot in the 16:9 widescreen format, and the first to use 5.1-channel sound mixes.
It was an early example of widespread use of CGI rather than models for space scenes, which allowed for more freedom and larger scale in creating such scenes. While praised at the time, due to budgetary and mastering issues these sequences are considered to have aged poorly. A recurring theme among reviewers is that the series was more than the sum of its parts: while variously criticizing the writing, directing, acting and effects, particularly by comparison to current television productions, reviewers praised the consistency of plotting over the series' run, transcending the quality of its individual elements. Many retrospectives, while criticizing virtually every individual aspect of the production, have praised the series as a whole for its narrative cohesion and contribution to serialized television. Producer John Copeland said: "You're not really tuning in to watch the visual effects"; instead, people are watching a two-decades-old show because "the storytelling does hold up" and "you wanna spend more time with the characters." DC began publishing "Babylon 5" comics in 1994, with stories (initially written by Straczynski) that closely tied in with events depicted in the show, with events in the comics eventually being referenced onscreen in the actual television series. The franchise continued to expand into short stories, RPGs, and novels, with the Technomage trilogy of books being the last to be published in 2001, shortly after the spin-off television series, "Crusade", was canceled. Excepting movie rights, which are retained by Straczynski, all production rights for the franchise are owned by Warner Bros. Media franchise. In November 1994, DC began publishing monthly "Babylon 5" comics. A number of short stories and novels were also produced between 1995 and 2001. Additional books were published by the gaming companies Chameleon Eclectic and Mongoose Publishing to support their desktop strategy and role-playing games. Three TV films were released by TNT in 1998, after it funded a fifth season of "Babylon 5" following the demise of the Prime Time Entertainment Network the previous year. In addition to "In the Beginning", "Thirdspace", and "The River of Souls", TNT released a re-edited special edition of the original 1993 TV film, "The Gathering". In 1999, a fifth TV film, "A Call to Arms", was produced, acting as a pilot movie for the spin-off series "Crusade", which TNT canceled after 13 episodes had been filmed. Dell Publishing started publication of a series of "Babylon 5" novels in 1995, which were ostensibly considered canon within the TV series' continuity, nominally supervised by Straczynski, with later novels in the line being more directly based upon Straczynski's own notes and story outlines. In 1997, Del Rey obtained the publication license from Warner Bros., and proceeded to release a number of original trilogies plotted directly by Straczynski, as well as novelizations of three of the TNT telefilms ("In the Beginning", "Thirdspace", and "A Call to Arms"). All of the Del Rey novels are considered completely canonical within the filmic "Babylon 5" universe. In 2000, the Sci-Fi Channel purchased the rights to rerun the "Babylon 5" series, and in 2002 it premiered a new telefilm, "The Legend of the Rangers", which failed to be picked up as a series. In 2007, the first in a planned anthology of straight-to-DVD short stories, entitled "The Lost Tales", was released by Warner Home Video, but no others were produced, due to funding issues.
Straczynski announced a "Babylon 5" film at the 2014 San Diego Comic-Con, but stated in 2016 that it had been delayed while he completed other productions. In 2018 Straczynski stated that although he possesses the movie rights, he believed that neither a film nor a television series revival would happen while Warner Bros. retained the intellectual property for the TV series, since Warner Bros. would insist on handling production, and other studios would be hesitant to produce a film without also having the rights to the TV series. Reboot. A reboot of "Babylon 5" was announced in September 2021. It was to be produced by Straczynski through Studio JMS, and developed by Warner Bros. Television for The CW, before Warner took the rights back. Direct-to-video animated film. Straczynski revealed in May 2023 that he had been working in secret with Warner Bros. Animation to produce "The Road Home", a full-length animated film. Most of the main surviving cast members voice their characters in the film, including Boxleitner, Christian, Jurasik, Mumy, Scoggins and Tallman. Other voice actors, replacing cast members who had died, include Paul Guyet as Zathras and Jeffrey Sinclair, Anthony Hansen as Michael Garibaldi, Mara Junot as a reporter and the computer voice, Phil LaMarr as Dr. Stephen Franklin, Piotr Michael as David Sheridan, Andrew Morgado as G'Kar, and Rebecca Riedy as Delenn. The film premiered at Comic-Con in July 2023, and was made available on streaming services and physical media on August 15, 2023. Home media. In July 1995, Warner Home Video began distributing "Babylon 5" VHS tapes under its Beyond Vision label in the UK. Beginning with the original telefilm, "The Gathering", these were PAL tapes, showing video in the same 4:3 aspect ratio as the initial television broadcasts. By the release of Season 2, tapes included closed captioning of dialogue and Dolby Surround sound. Columbia House began distributing NTSC tapes via mail order in 1997, followed by repackaged collector's editions and three-tape boxed sets in 1999, by which time the original pilot telefilm had been replaced by the re-edited TNT special edition. Additional movie and complete season boxed sets were also released by Warner Bros. until 2000. Image Entertainment released "Babylon 5" laserdiscs between December 1998 and September 1999. Produced on double-sided 12-inch Pioneer discs, each contained two episodes displayed in the 4:3 broadcast aspect ratio, with Dolby Surround audio and closed captioning for the dialogue. Starting with two TNT telefilms, "In the Beginning" and the re-edited special edition of "The Gathering", Seasons 1 and 5 were released simultaneously over a six-month period. Seasons 2 and 4 followed, but with the decision to halt production due to a drop in sales, precipitated by rumors of a pending DVD release, only the first twelve episodes of Season 2 and the first six episodes of Season 4 were ultimately released. In November 2001, Warner Home Video began distributing "Babylon 5" DVDs with a two-movie set containing the re-edited TNT special edition of "The Gathering" and "In the Beginning". The telefilms were later individually released in Region 2 in April 2002, though some markets received the original version of "The Gathering" in identical packaging. DVD boxed sets of the individual seasons, each containing six discs, began being released in October 2002.
Each included a printed booklet containing episode summaries, with each disc containing audio options for German, French, and English, plus subtitles in a wider range of languages, including Arabic and Dutch. Video was digitally remastered from higher-resolution PAL broadcast masters and displayed in anamorphic widescreen with remastered and remixed Dolby Digital 5.1 sound. Disc 1 of each set contained an introduction to the season by Straczynski, while disc 6 included featurettes containing interviews with various production staff, as well as information on the fictional universe, and a gag reel. Three episodes in each season also contained commentary from Straczynski, members of the main cast, and/or the episode director. Since its initial release, a number of repackaged DVD boxed sets have been produced for various regional markets. With slightly altered cover art, they included no additional content, but the discs were more securely stored in slimline cases, rather than the early "book" format with hard plastic pages used during the original release of the first three seasons. On July 19, 2023, Warner announced that "Babylon 5" would be released on Blu-ray on December 5, 2023, for the show's 30th anniversary. Mastering problems. While the series was in pre-production, studios were looking at ways for their existing shows to make the transition from the standard 4:3 format to the widescreen formats that would accompany the next generation of televisions. After visiting Warner Bros., which was stretching the horizontal interval for an episode of one of its existing shows, producer John Copeland convinced them to allow "Babylon 5" to be shot on Super 35mm film stock. "The idea being that we would telecine to 4:3 for the original broadcast of the series. But what it also gave us was a negative that had been shot for the new 16×9 widescreen-format televisions that we knew were on the horizon." Though the CG scenes, and those containing live action combined with digital elements, could have been created in a suitable widescreen format, a cost-saving decision was taken to produce them in the 4:3 aspect ratio. When those images were prepared for widescreen release, the top and bottom of the images were simply cropped, and the remaining image 'blown up' to match the dimensions of the live action footage, noticeably reducing the image quality. The scenes containing live action ready to be composited with matte paintings, CG animation, etc., were delivered on tape already telecined to the 4:3 aspect ratio, and contained a high level of grain, which resulted in further image noise being present when enlarged and stretched for widescreen. For the purely live-action scenes, rather than using the film negatives, according to Copeland, "Warners had even forgotten that they had those. They used PAL versions and converted them to NTSC for the US market. They actually didn't go back and retransfer the shows." The resulting aliasing, combined with the progressive-scan transfer of the video to DVD, created a number of visual flaws throughout the widescreen release. In particular, quality has been noted to drop significantly in composite shots. In 2020 a new remastered version was created (as detailed above) which restored the original picture quality and repaired the damage to the CGI scenes, although this also involved reverting to the original 4:3 aspect ratio. The negatives were scanned at Ultra HD quality and then downconverted to HD, while upscaling software was used to enhance the CGI to HD at the same time.
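As a rough, back-of-envelope illustration of the resolution cost of the crop-and-enlarge approach described above (the 480-line figure assumes an NTSC frame and a straight center crop; neither number is stated in the source):

```latex
% A 4:3 frame of width W has height (3/4)W; a 16:9 crop of the same width
% has height (9/16)W, so the fraction of scan lines retained is
\frac{9/16}{3/4} = \frac{9}{12} = \frac{3}{4}
% i.e. a 480-line 4:3 CG frame keeps only 480 \times 3/4 = 360 lines,
% which must then be enlarged by a factor of 4/3 to fill the widescreen
% frame, discarding a quarter of the vertical resolution of every shot.
```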
4801
27823944
https://en.wikipedia.org/wiki?curid=4801
BeOS
BeOS is a discontinued operating system for personal computers that was developed by Be Inc. It was conceived for the company's BeBox personal computer, which was released in 1995. BeOS was designed for multitasking, multithreading, and a graphical user interface. The OS was later sold to OEMs, at retail, and directly to users; its last version was released as freeware. Early BeOS releases are for PowerPC. It was ported to the Macintosh, then to x86. Be was ultimately unable to achieve a significant market share and ended development with dwindling finances, so Palm acquired the BeOS assets in 2001. Enthusiasts have since created derivative operating systems, including Haiku, which retains BeOS 5 compatibility as of Release R1. Development. BeOS is the product of Apple Computer's former business executive Jean-Louis Gassée, with the underlying philosophy of building a "media OS" capable of handling emerging digital media and multiple processors. Development began in the early 1990s; the system was initially designed to run on AT&T Hobbit-based hardware before being modified to run on PowerPC-based processors: first Be's own BeBox system, and later Apple Computer's PowerPC Reference Platform and Common Hardware Reference Platform, with the hope that Apple would purchase or license BeOS as a replacement for its aging Mac OS. The first version of BeOS shipped with the BeBox to a limited number of developers in October 1995. It supported analog and digital audio and MIDI streams, multiple video sources, and 3D computation. Developer Release 6 (DR6) was the first officially available version. The BeOS Developer Release 7 (DR7) was released in April 1996. It included full 32-bit color graphics, "workspaces" (virtual desktops), an FTP file server, and a web server. DR8 was released in September 1996 with a new browser supporting the MPEG and QuickTime video formats; it also added support for OpenGL, remote access, and the Power Macintosh. In 1996, Apple Computer CEO Gil Amelio started negotiations to buy Be Inc., but the talks stalled when Be CEO Jean-Louis Gassée wanted $300 million and Apple offered $125 million. Apple's board of directors preferred NeXTSTEP and purchased Steve Jobs's NeXT instead. The final developer's release introduced a 64-bit file system. BeOS Preview Release (PR1), the first for the general public, was released in mid-1997. It supported AppleTalk, PostScript printing, and Unicode. The price for the Full Pack was $49.95. Later that year, Preview Release 2 shipped with support for the Macintosh Hierarchical File System (HFS), support for 512 MB of RAM, and improvements to the user interface. Release 3 (R3) shipped in March 1998 (initially $69.95, later $99.95) as the first version to be ported to the Intel x86 platform in addition to PowerPC, and the first commercially available version of BeOS. The adoption of x86 was partly due to Apple's moves, with Steve Jobs stopping the Macintosh clone market, and Be's mounting debt. BeOS Release 4 had a claimed performance improvement of up to 30 percent. Keyboard shortcuts were changed to mimic those of Windows. However, it still lacked Novell NetWare support. It also brought additional drivers and support for the most common SCSI controllers on the x86 platform, from Adaptec and Symbios Logic. The bootloader switched from LILO to Be's own bootman. In 2000, BeOS Release 5 (R5) was released. This was split between a Pro Edition and a free version known as Personal Edition (BeOS PE), which was distributed free online and on CD-ROM.
BeOS PE could be booted from within Windows or Linux, and was intended as a consumer and developer preview. Also with R5, Be open-sourced elements of the user interface. Be CEO Gassée said in 2001 that he was open to the idea of releasing the entire operating system's source code, but this never materialized. Release 5 raised BeOS's popularity, but the system remained commercially unsuccessful, and BeOS development eventually halted following the introduction of a stripped-down version for Internet appliances, BeIA, which became the company's business focus in place of BeOS. R5 was the final official release of BeOS, as Be Inc. became defunct in 2001 following its sale to Palm Inc. BeOS R5.1 "Dano", which was under development before Be's sale to Palm and includes the BeOS Networking Environment (BONE) networking stack, was leaked to the public shortly after the company's closure. Hardware support and licensees. After the discontinuation of the BeBox in January 1997, Power Computing began bundling BeOS (on a CD-ROM for optional installation) with its line of PowerPC-based Macintosh clones. These systems could dual-boot either Mac OS or BeOS, with a start-up screen offering the choice. Motorola also announced in February 1997 that it would bundle BeOS with its Macintosh clones, the Motorola StarMax, along with Mac OS. DayStar Digital was another licensee. BeOS is compatible with many Macintosh models, but not the PowerBook line. With BeOS Release 3 on the x86 platform, the operating system became compatible with most computers that run Windows. Hitachi was the first major x86 OEM to ship BeOS, selling the Hitachi Flora Prius line in Japan, and Fujitsu released the Silverline computers in Germany and the Nordic countries. Be was unable to attract further manufacturers due to their Microsoft contracts. Be closed in 2002 and sued Microsoft, claiming that Hitachi had been dissuaded from selling PCs loaded with BeOS. The case was eventually settled out of court for $23.25 million with no admission of liability on Microsoft's part. Architecture. BeOS was developed as an original product, with a proprietary kernel, symmetric multiprocessing, preemptive multitasking, and pervasive multithreading. It runs in protected memory mode, with a C++ application framework based on shared libraries and modular code. Be initially offered CodeWarrior for application development, and later EGCS. Its API is object-oriented. The user interface was largely multithreaded: each window ran in its own thread, relying heavily on sending messages to communicate between threads, and these concepts are reflected in the API (a minimal sketch appears below). BeOS uses modern hardware facilities such as modular I/O bandwidth, a multithreaded graphics engine (with the OpenGL library), and a 64-bit journaling file system named BFS supporting files up to one terabyte each. BeOS has partial POSIX compatibility and a command-line interface through Bash, although internally it is not a Unix-derived operating system. Many Unix applications were ported to the BeOS command-line interface. BeOS uses Unicode as the default GUI encoding, but support for input methods such as bidirectional text input was never realized. Applications. BeOS is bundled with a unique web browser named NetPositive, the BeMail email client, and the PoorMan web server. Be operated the marketplace site BeDepot for the purchase and download of software, including third-party titles, and a website named BeWare listing apps for the platform.
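The following is a minimal, illustrative sketch of the application framework described under Architecture above. BApplication, BWindow, and BView are the framework's real core classes; the MIME signature string and the window text here are placeholders, so this is a sketch under those assumptions rather than code taken from Be's documentation:

```cpp
// Minimal BeOS app: one window, one view, message-based shutdown.
#include <Application.h>
#include <Window.h>
#include <View.h>

class HelloView : public BView {
public:
	HelloView(BRect frame)
		: BView(frame, "hello_view", B_FOLLOW_ALL_SIDES, B_WILL_DRAW) {}

	virtual void Draw(BRect updateRect) {
		// Invoked from the window's own thread whenever repainting is needed.
		DrawString("Hello from BeOS", BPoint(20, 30));
	}
};

class HelloWindow : public BWindow {
public:
	HelloWindow()
		: BWindow(BRect(100, 100, 400, 300), "Hello", B_TITLED_WINDOW, 0) {
		AddChild(new HelloView(Bounds()));
	}

	virtual bool QuitRequested() {
		// Ask the application thread to shut down when the window closes.
		be_app->PostMessage(B_QUIT_REQUESTED);
		return true;
	}
};

class HelloApp : public BApplication {
public:
	// The constructor takes the application's MIME signature (placeholder).
	HelloApp() : BApplication("application/x-vnd.example-hello") {}

	virtual void ReadyToRun() {
		// Show() spawns the window's dedicated message-loop thread.
		(new HelloWindow())->Show();
	}
};

int main() {
	HelloApp app;
	app.Run();  // runs the application's own message loop until quit
	return 0;
}
```

Note how the application object and the window each run a message loop in separate threads and coordinate purely by posting messages, which reflects the pervasive multithreading the Architecture section describes.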
Some third-party BeOS apps included the Gobe Productive office suite, the Mozilla project, and multimedia apps like Cinema 4D. "Quake" and "Quake II" were officially ported, and "SimCity 3000" was in development. Reception. Be did not disclose the number of BeOS users, but the system was estimated to be running on between 50,000 and 100,000 computers in 1999, and Release 5 reportedly had over one million downloads. For a time BeOS was viewed as a viable competitor to Mac OS and Windows, but its status as the "alternative operating system" had been surpassed by Linux by 1998. Reception of the operating system was largely positive, with reviewers citing its "reliable" multitasking and its support for multiple processors. Though its market penetration was low, it gained a niche multimedia userbase and acceptance in the audio community; consequently, it was styled as a "media OS" for its well-regarded ability to handle audio and video. BeOS received significant interest in Japan, and also appealed to Amiga developers and users who were looking for a newer platform. BeOS and its successors have been used in media appliances, such as the Edirol DV-7 video editors from Roland Corporation, which run on a modified BeOS, and the TuneTracker radio automation software, which ran on BeOS and Zeta and was also sold as a "Station-in-a-Box" with the Zeta operating system included. In 2015, TuneTracker released a Haiku distribution bundled with its broadcasting software. Legacy. The Tascam SX-1 digital audio recorder runs a heavily modified version of BeOS that launches only the recording interface software. The RADAR 24, RADAR V and RADAR 6, hard disk-based, 24-track professional audio recorders from iZ Technology Corporation, were based on BeOS 5. Magicbox, a manufacturer of signage and broadcast display machines, uses BeOS to power its Aavelin product line. Final Scratch, a DJ hardware and software system driven by 12-inch timecode vinyl records, was first developed on BeOS: the "ProFS" version was sold to a few dozen DJs prior to the 1.0 release, which ran on a Linux virtual partition. Spiritual successors. After BeOS came to an end, Palm created PalmSource, which used parts of BeOS's multimedia framework for its failed Palm OS Cobalt product (with the takeover of PalmSource, the BeOS rights were assigned to Access Co.). Palm, however, refused requests from BeOS users to license the operating system. As a result, a few projects formed to recreate BeOS or its key elements, with the eventual goal of continuing where Be Inc. left off. BeUnited, a BeOS-oriented community, converted itself into a nonprofit organization in August 2001 to "define and promote open specifications for the delivery of the Open Standards BeOS-compatible Operating System (OSBOS) platform". ZETA. Immediately after Palm's purchase of Be, a German company named yellowTAB started developing Zeta, based on the BeOS R5.1 codebase, and released it commercially; it was later distributed by magnussoft. During development by yellowTAB, the company received criticism from the BeOS community for refusing to discuss its legal position with regard to the BeOS codebase. Access Co. (which had bought PalmSource, until then the holder of the intellectual property associated with BeOS) declared that yellowTAB had no right to distribute a modified version of BeOS, and magnussoft was forced to cease distribution of the operating system in 2007. Haiku (OpenBeOS). Haiku is a complete open source reimplementation of BeOS.
It was originally named OpenBeOS, and its first release, in 2002, was a community update for BeOS 5. Unlike Cosmoe and BlueEyedOS, it is directly compatible with BeOS applications. As of 2024, it was the only BeOS clone still under development: the fifth beta, released in September 2024, still keeps BeOS 5 compatibility in its x86 32-bit images, while adding a growing number of ported modern drivers and GTK applications. Others. BlueEyedOS attempted to create an LGPL-licensed system, based on the Linux kernel and an X server, that was compatible with BeOS. Work began under the name BlueOS in 2001 and a demo CD was released in 2003; the project was discontinued in February 2005. Cosmoe, with a BeOS-like interface, was designed by Bill Hayden as an open source operating system based on the source code of AtheOS and later OpenBeOS, but using the Linux kernel. ZevenOS was designed to continue where Cosmoe left off. In mid 2024, Cosmoe was resurrected by its original author after 17 years, with a much improved codebase based on contemporary Haiku. BeFree started in 2003, initially developed under FreeBSD and later Linux.
4802
982329
https://en.wikipedia.org/wiki?curid=4802
Biome
A biome is a distinct geographical region with specific climate, vegetation, and animal life. It consists of a biological community that has formed in response to its physical environment and regional climate. In 1935, Tansley added the climatic and soil aspects to the idea, calling the result an "ecosystem". The projects of the International Biological Program (1964–74) popularized the concept of the biome. However, in some contexts, the term "biome" is used in a different manner. In German literature, particularly in the Walter terminology, the term is used similarly to "biotope" (a concrete geographical unit), while the biome definition used in this article is an international, non-regional terminology: irrespective of the continent in which an area is located, it takes the same biome name. This usage corresponds to Walter's "zonobiome", "orobiome" and "pedobiome" (biomes determined by climate zone, altitude or soil). In the Brazilian literature, the term "biome" is sometimes used as a synonym of "biogeographic province", an area based on species composition (the term "floristic province" being used when plant species are considered), or as a synonym of the "morphoclimatic and phytogeographical domain" of Ab'Sáber, a geographic space with subcontinental dimensions, a predominance of similar geomorphologic and climatic characteristics, and a certain vegetation form. In fact, both of these concepts encompass many biomes. Classifications. Dividing the world into a few ecological zones is difficult, notably because of the small-scale variations that exist everywhere on earth and because of the gradual changeover from one biome to the other. Their boundaries must therefore be drawn arbitrarily and their characterization made according to the average conditions that predominate in them. A 1978 study on North American grasslands found a positive logistic correlation between evapotranspiration in mm/yr and above-ground net primary production in g/m2/yr. The general results of the study were that precipitation and water use led to above-ground primary production, while solar irradiation and temperature led to below-ground primary production (roots), and temperature and water led to cool and warm season growth habit. These findings help explain the categories used in Holdridge's bioclassification scheme (see below), which was later simplified by Whittaker. The number of classification schemes and the variety of determinants used in those schemes should, however, be taken as strong indicators that biomes do not fit perfectly into the classification schemes created. Holdridge (1947, 1964) life zones. In 1947, the American botanist and climatologist Leslie Holdridge classified climates based on the biological effects of temperature and rainfall on vegetation, under the assumption that these two abiotic factors are the largest determinants of the types of vegetation found in a habitat. Holdridge uses the four axes to define 30 so-called "humidity provinces", which are clearly visible in his diagram. While this scheme largely ignores soil and sun exposure, Holdridge acknowledged that these were important. Allee (1949) biome-types. The principal biome-types by Allee (1949): Kendeigh (1961) biomes. The principal biomes of the world by Kendeigh (1961): Whittaker (1962, 1970, 1975) biome-types. Whittaker classified biomes using two abiotic factors: precipitation and temperature (a toy code sketch of such a two-factor lookup follows). His scheme can be seen as a simplification of Holdridge's: more readily accessible, but lacking Holdridge's greater specificity.
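To make the two-factor idea concrete, the sketch below expresses a Whittaker-style lookup in code. This is a toy illustration only: the numeric thresholds and category names are invented for the example and are not Whittaker's published boundaries, which are defined graphically on his precipitation-temperature diagram.

#include <iostream>
#include <string>

// Toy Whittaker-style classifier: maps mean annual temperature (deg C)
// and annual precipitation (cm) to a biome-type name.
// Thresholds are illustrative assumptions, not Whittaker's exact boundaries.
std::string classifyBiome(double tempC, double precipCm) {
    if (tempC < -5.0)  return "tundra";
    if (tempC < 3.0)   return "boreal forest (taiga)";
    if (tempC < 20.0) {
        if (precipCm < 50.0)  return "temperate grassland / desert";
        if (precipCm < 200.0) return "temperate forest";
        return "temperate rain forest";
    }
    if (precipCm < 50.0)  return "subtropical desert";
    if (precipCm < 250.0) return "tropical seasonal forest / savanna";
    return "tropical rain forest";
}

int main() {
    // A warm, very wet climate falls in the tropical rain forest region.
    std::cout << classifyBiome(26.0, 300.0) << "\n";  // tropical rain forest
    std::cout << classifyBiome(-10.0, 20.0) << "\n";  // tundra
    return 0;
}

The point of the sketch is only that two climate variables suffice to index a biome-type in such a scheme; the real diagram draws curved, empirically fitted boundaries rather than rectangular ones.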
Whittaker based his approach on theoretical assertions and empirical sampling; he had previously compiled a review of biome classifications. Key definitions for understanding Whittaker's scheme. Whittaker's distinction between biome and formation can be simplified: formation is used when applied to plant communities only, while biome is used when concerned with both plants and animals. Whittaker's convention of biome-type or formation-type is a broader method of categorizing similar communities. Whittaker's parameters for classifying biome-types. Whittaker used what he called "gradient analysis" of ecocline patterns to relate communities to climate on a worldwide scale. Whittaker considered four main ecoclines in the terrestrial realm. Along these gradients, Whittaker noted several trends that allowed him to qualitatively establish biome-types: Whittaker summed the effects of gradients (3) and (4) to get an overall temperature gradient and combined this with gradient (2), the moisture gradient, to express the above conclusions in what is known as the Whittaker classification scheme. The scheme graphs average annual precipitation (x-axis) against average annual temperature (y-axis) to classify biome-types. Goodall (1974–) ecosystem types. The multi-authored series "Ecosystems of the World", edited by David W. Goodall, provides comprehensive coverage of the major "ecosystem types or biomes" on Earth: Walter (1976, 2002) zonobiomes. The classification scheme of Heinrich Walter considers the seasonality of temperature and precipitation. The system, also assessing precipitation and temperature, finds nine major biome types, with their important climate traits and vegetation types. The boundaries of each biome correlate to the conditions of moisture and cold stress that are strong determinants of plant form, and therefore of the vegetation that defines the region. Extreme conditions, such as flooding in a swamp, can create different kinds of communities within the same biome. Schultz (1988) eco-zones. Schultz (1988, 2005) defined nine "ecozones" (his concept of ecozone is more similar to the concept of biome than to the concept of ecozone of BBC): Bailey (1989) ecoregions. Robert G. Bailey developed a biogeographical classification system of ecoregions for the United States in a map published in 1976. He subsequently expanded the system to include the rest of North America in 1981, and the world in 1989. The Bailey system, based on climate, is divided into four domains (polar, humid temperate, dry, and humid tropical), with further divisions based on other climate characteristics (subarctic, warm temperate, hot temperate, and subtropical; marine and continental; lowland and mountain). Olson & Dinerstein (1998) biomes for WWF / Global 200. A team of biologists convened by the World Wildlife Fund (WWF) developed a scheme that divided the world's land area into biogeographic realms (called "ecozones" in a BBC scheme), and these into ecoregions (Olson & Dinerstein, 1998, etc.). Each ecoregion is characterized by a main biome (also called a major habitat type). This classification is used to define the Global 200 list of ecoregions identified by the WWF as priorities for conservation. Each terrestrial ecoregion has a specific EcoID, in the format XXnnNN (XX is the biogeographic realm, nn is the biome number, and NN is the individual ecoregion number). Biogeographic realms (terrestrial and freshwater).
The applicability of the realms scheme above, based on Udvardy (1975), to most freshwater taxa is unresolved. Biomes (freshwater). According to the WWF, the following are classified as freshwater biomes: Biomes (marine). Biomes of the coastal and continental shelf areas (neritic zone): Summary of the scheme. Example: Other biomes. Marine biomes. Pruvot (1896) zones or "systems": Longhurst (1998) biomes: Other marine habitat types (not yet covered by the Global 200/WWF scheme): Anthropogenic biomes. Humans have altered global patterns of biodiversity and ecosystem processes. As a result, the vegetation forms predicted by conventional biome systems can no longer be observed across much of Earth's land surface, as they have been replaced by crops, rangelands, or cities. Anthropogenic biomes provide an alternative view of the terrestrial biosphere based on global patterns of sustained direct human interaction with ecosystems, including agriculture, human settlements, urbanization, forestry, and other uses of land. Anthropogenic biomes offer a way to recognize the irreversible coupling of human and ecological systems at global scales and to manage Earth's biosphere accordingly. Major anthropogenic biomes: Microbial biomes. Endolithic biomes. The endolithic biome, consisting entirely of microscopic life in rock pores and cracks, kilometers beneath the surface, has only recently been discovered, and does not fit well into most classification schemes. Effects of climate change. Anthropogenic climate change has the potential to greatly alter the distribution of Earth's biomes: climates around the world could shift so much that many regions would effectively belong to different biomes. More specifically, between 22% and 54% of global land area is projected to experience climates that correspond to other biomes, and 3.6% of land area to experience climates that are completely new or unusual. An example of a biome shift is woody plant encroachment, which can change grass savanna into shrub savanna. Average temperatures in arctic and mountainous biomes have risen more than twice as much as the global average, which leads to the conclusion that these biomes are currently the most vulnerable to climate change. South American terrestrial biomes are predicted to undergo the same temperature trends as arctic and mountainous biomes. As annual average temperatures continue to increase, the moisture currently held in forest biomes is expected to dry up.
4805
28032115
https://en.wikipedia.org/wiki?curid=4805
Behavior
Behavior (American English) or behaviour (British English) is the range of actions of individuals, organisms, systems, or artificial entities in some environment. These systems can include other systems or organisms as well as the inanimate physical environment. Behavior is the computed response of the system or organism to various stimuli or inputs, whether internal or external, conscious or subconscious, overt or covert, and voluntary or involuntary. While some behavior is produced in response to an organism's environment (extrinsic motivation), behavior can also be the product of intrinsic motivation, also referred to as "agency" or "free will". From a behavior informatics perspective, a behavior consists of an actor, an operation, interactions, and their properties, and can be represented as a behavior vector (see the sketch at the end of this article). Models. Biology. Definition. Behavior may be defined as "the internally coordinated responses (actions or inactions) of whole living organisms (individuals or groups) to internal or external stimuli". A broader definition of behavior, applicable to plants and other organisms, is similar to the concept of phenotypic plasticity. It describes behavior as a response to an event or environment change during the course of the lifetime of an individual, differing from other physiological or biochemical changes that occur more rapidly, and excluding changes that are a result of development (ontogeny). Behaviour can be regarded as any action of an organism that changes its relationship to its environment; behavior provides outputs from the organism to the environment. Determination by genetics or the environment. Behaviors can be innate, learned from the environment, or both, depending on the organism. The more complex an organism's nervous system (or brain) is, the more influence learning has on its behavior. However, even in mammals, a large fraction of behavior is genetically determined. For instance, prairie voles tend to be monogamous, while meadow voles are more promiscuous, a difference that is strongly determined by a single gene, Avpr1a, encoding a receptor for the peptide hormone vasopressin. Human behavior. The endocrine system and the nervous system likely influence human behavior. Complexity in the behavior of an organism may be correlated to the complexity of its nervous system. Generally, organisms with more complex nervous systems have a greater capacity to learn new responses and thus adjust their behavior. Consumer behaviour is the behavior of humans when they act as, or are treated as, consumers. Animal behavior. Ethology is the scientific and objective study of animal behavior, usually with a focus on behavior under natural conditions, and viewing behavior as an evolutionarily adaptive trait. Behaviorism is a term that also describes the scientific and objective study of animal behavior, usually referring to measured responses to stimuli or trained behavioral responses in a laboratory context, without a particular emphasis on evolutionary adaptivity. In management. Organizational. In management, behaviors are associated with desired or undesired focuses. Managers generally note what the desired outcome is, but behavioral patterns can take over. These patterns are the reference for how often the desired behavior actually occurs. Before a behavior occurs, antecedents focus on the stimuli that influence the behavior that is about to happen. After the behavior occurs, consequences follow; consequences consist of rewards or punishments. Social behavior.
Social behavior is behavior among two or more organisms within the same species, encompassing any behavior in which one member affects another; it arises from interaction among those members. Social behavior can be seen as similar to an exchange of goods, with the expectation that when one gives, one will receive the same. This behavior can be affected both by the qualities of the individual and by environmental (situational) factors; social behavior therefore arises as a result of an interaction between the two, the organism and its environment. This means that, in regard to humans, social behavior can be determined both by the individual characteristics of the person and by the situation they are in. Behavior informatics. Behavior informatics, also called behavior computing, explores behavior intelligence and behavior insights from the informatics and computing perspectives. Different from applied behavior analysis, which works from a psychological perspective, behavior informatics builds computational theories, systems, and tools to qualitatively and quantitatively model, represent, analyze, and manage the behaviors of individuals, groups, and organizations. Health. Health behavior refers to a person's beliefs and actions regarding their health and well-being. Health behaviors are direct factors in maintaining a healthy lifestyle. They are influenced by the social, cultural, and physical environments in which we live, and are shaped by individual choices and external constraints. Positive behaviors help promote health and prevent disease, while the opposite is true for risk behaviors. Health behaviors are early indicators of population health. Because of the time lag that often occurs between certain behaviors and the development of disease, these indicators may foreshadow the future burdens and benefits of health-risk and health-promoting behaviors. Correlates. A variety of studies have examined the relationship between health behaviors and health outcomes (e.g., Blaxter 1990) and have demonstrated their role in both morbidity and mortality. These studies identified seven features of lifestyle which were associated with lower morbidity and higher subsequent long-term survival (Belloc and Breslow 1972): Health behaviors affect individuals' quality of life by delaying the onset of chronic disease and extending active lifespan. Smoking, alcohol consumption, diet, gaps in primary care services, and low screening uptake are all significant determinants of poor health, and changing such behaviors should lead to improved health. For example, in the US, "Healthy People 2000", from the United States Department of Health and Human Services, lists increased physical activity, changes in nutrition, and reductions in tobacco, alcohol, and drug use as important for health promotion and disease prevention. Treatment approach. Any interventions are matched to the needs of each individual in an ethical and respectful manner. The health belief model encourages increasing individuals' perceived susceptibility to negative health outcomes and making individuals aware of the severity of such negative health behavior outcomes, e.g. through health promotion messages. In addition, the health belief model suggests the need to focus on the benefits of health behaviors and on the fact that barriers to action are easily overcome. The theory of planned behavior suggests using persuasive messages to tackle behavioral beliefs and increase readiness to perform a behavior, called "intention".
The theory of planned behavior also advocates the need to tackle normative beliefs and control beliefs in any attempt to change behavior. Challenging normative beliefs is not enough; following through on the "intention" with self-efficacy, built from an individual's mastery of problem solving and task completion, is important for bringing about positive change. Self-efficacy is often cemented through standard persuasive techniques.
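As a schematic illustration of the behavior-vector representation mentioned at the start of this article, the sketch below encodes a behavior's actor, operation, and interaction properties as a simple record. The field names and example values are invented for the illustration and are not drawn from any particular behavior informatics system.

#include <string>
#include <vector>
#include <iostream>

// Illustrative only: one behavior = actor + operation + interactions + properties.
struct Behavior {
    std::string actor;                 // who acts
    std::string operation;             // what they do
    std::vector<std::string> targets;  // who or what is affected (interactions)
    double timestamp;                  // one property of the behavior
};

// A "behavior vector" here is simply an ordered sequence of such records,
// which can then be modeled and analyzed quantitatively.
using BehaviorVector = std::vector<Behavior>;

int main() {
    BehaviorVector history = {
        {"customer42", "view",     {"productA"}, 0.0},
        {"customer42", "purchase", {"productA"}, 37.5},
    };
    for (const Behavior& b : history)
        std::cout << b.actor << " -> " << b.operation << "\n";
    return 0;
}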
4806
37194006
https://en.wikipedia.org/wiki?curid=4806
Battle of Marathon
The Battle of Marathon took place in 490 BC during the first Persian invasion of Greece. It was fought between the citizens of Athens, aided by Plataea, and a Persian force commanded by Datis and Artaphernes. The battle was the culmination of the first attempt by Persia under King Darius I to subjugate Greece. The Greek army inflicted a crushing defeat on the more numerous Persians, marking a turning point in the Greco-Persian Wars. The first Persian invasion was a response to Athenian involvement in the Ionian Revolt, when the city-states of Athens and Eretria each sent a force to support the cities of Ionia in their attempt to overthrow Persian rule. The Athenians and Eretrians had succeeded in capturing and burning Sardis, but they were then forced to retreat with heavy losses. In response to this raid, Darius swore to burn down the two cities that had aided the failed revolt. According to Herodotus, Darius had his bow brought to him and then shot an arrow "upwards towards heaven", saying as he did so: "Zeus, that it may be granted me to take vengeance upon the Athenians!" Herodotus further writes that Darius charged one of his servants to say "Master, remember the Athenians" three times before dinner each day. At the time of the battle, Sparta and Athens were the two largest city-states in Greece. Once the Ionian revolt was finally crushed by the Persian victory at the Battle of Lade in 494 BC, Darius began making plans to subjugate Greece. In 490 BC, he sent a naval task force under Datis and Artaphernes across the Aegean Sea, to subjugate the Cyclades islands and then to make punitive attacks on Athens and Eretria. Reaching the island of Euboea in mid-summer after a successful campaign in the Aegean, the Persians proceeded to besiege and capture Eretria. The Persian force then sailed for Attica, landing in the bay near the town of Marathon. The Athenians, joined by a small force from Plataea, marched to Marathon, and succeeded in blocking the two exits from the plain of Marathon. The Athenians also sent a message to the Spartans asking for support. When the messenger arrived in Sparta, the Spartans were already involved in a religious festival and gave this as a reason for not coming to help the Athenians. The Athenians and their allies chose a location for the battle, with marshes and mountainous terrain, that prevented the Persian cavalry from joining the Persian infantry. Miltiades, the Athenian general, ordered a general attack against the Persian forces, composed primarily of missile troops. He reinforced his flanks, luring the Persians' best fighters into his center. The inward wheeling flanks enveloped the Persians, routing them. The Persian army broke in panic towards their ships, and large numbers were slaughtered. The defeat at Marathon marked the end of the first Persian invasion of Greece, and the Persian force retreated to Asia. Darius then began raising a huge new army with which he meant to completely subjugate Greece; however, in 486 BC, his Egyptian subjects revolted, indefinitely postponing any Greek expedition. After Darius died, his son Xerxes I restarted the preparations for a second invasion of Greece, which finally began in 480 BC. The Battle of Marathon was a watershed in the Greco-Persian wars, showing the Greeks that the Persians could be beaten; the eventual Greek triumph in these wars can be seen to have begun at Marathon. 
The battle also showed the Greeks that they were able to win battles without the Spartans, as Sparta was seen as the major military force in Greece. This victory was overwhelmingly won by the Athenians, and Marathon raised Greek esteem of them. The following two hundred years saw the rise of the Classical Greek civilization, which has been enduringly influential in Western society, and so the Battle of Marathon is often seen as a pivotal moment in Mediterranean and European history, and is often celebrated today. Background. The first Persian invasion of Greece had its immediate roots in the Ionian Revolt, the earliest phase of the Greco-Persian Wars. However, it was also the result of the longer-term interaction between the Greeks and Persians. In 500 BC the Persian Empire was still relatively young and highly expansionistic, but prone to revolts amongst its subject peoples. Moreover, the Persian King Darius was a usurper, and had spent considerable time extinguishing revolts against his rule. Even before the Ionian Revolt, Darius had begun to expand the empire into Europe, subjugating Thrace, and forcing Macedon to become a vassal of Persia. Attempts at further expansion into the politically fractious world of ancient Greece may have been inevitable. However, the Ionian Revolt had directly threatened the integrity of the Persian empire, and the states of mainland Greece remained a potential menace to its future stability. Darius thus resolved to subjugate and pacify Greece and the Aegean, and to punish those involved in the Ionian Revolt. The Ionian Revolt had begun with an unsuccessful expedition against Naxos, a joint venture between the Persian satrap Artaphernes and the Milesian tyrant Aristagoras. In the aftermath, Artaphernes decided to remove Aristagoras from power, but before he could do so, Aristagoras abdicated, and declared Miletus a democracy. The other Ionian cities followed suit, ejecting their Persian-appointed tyrants, and declaring themselves democracies. Aristagoras then appealed to the states of mainland Greece for support, but only Athens and Eretria offered to send troops. The involvement of Athens in the Ionian Revolt arose from a complex set of circumstances, beginning with the establishment of the Athenian Democracy in the late 6th century BC. In 510 BC, with the aid of Cleomenes I, King of Sparta, the Athenian people had expelled Hippias, the tyrant ruler of Athens. With Hippias's father Peisistratus, the family had ruled for 36 out of the previous 50 years and fully intended to continue Hippias's rule. Hippias fled to Sardis to the court of the Persian satrap Artaphernes, and promised control of Athens to the Persians if they were to help restore him. In the meantime, Cleomenes helped install a pro-Spartan tyranny under Isagoras in Athens, in opposition to Cleisthenes, the leader of the traditionally powerful Alcmaeonidae family, who considered themselves the natural heirs to the rule of Athens. Cleisthenes, however, found himself being politically defeated by a coalition led by Isagoras and decided to change the rules of the game by appealing to the "demos" (the people), in effect making them a new faction in the political arena. This tactic succeeded, but the Spartan King, Cleomenes I, returned at the request of Isagoras, and so Cleisthenes, the Alcmaeonids and other prominent Athenian families were exiled from Athens.
When Isagoras attempted to create a narrow oligarchic government, the Athenian people, in a spontaneous and unprecedented move, expelled Cleomenes and Isagoras. Cleisthenes was thus restored to Athens (507 BC), and at breakneck speed began to reform the state with the aim of securing his position. The result was not actually a democracy or a real civic state, but he enabled the development of a fully democratic government, which would emerge in the next generation as the demos realized its power. The new-found freedom and self-governance of the Athenians meant that they were thereafter exceptionally hostile to the return of the tyranny of Hippias, or any form of outside subjugation, by Sparta, Persia, or anyone else. Cleomenes was not pleased with events, and marched on Athens with the Spartan army. Cleomenes's attempts to restore Isagoras to Athens ended in a debacle, but fearing the worst, the Athenians had by this point already sent an embassy to Artaphernes in Sardis, to request aid from the Persian empire. Artaphernes requested that the Athenians give him "earth and water", a traditional token of submission, to which the Athenian ambassadors acquiesced. They were, however, severely censured for this when they returned to Athens. At some later point Cleomenes instigated a plot to restore Hippias to the rule of Athens. This failed, and Hippias again fled to Sardis and tried to persuade the Persians to subjugate Athens. The Athenians dispatched ambassadors to Artaphernes to dissuade him from taking action, but Artaphernes merely instructed the Athenians to take Hippias back as tyrant. The Athenians indignantly declined, and instead resolved to open war with Persia. Having thus become the enemy of Persia, Athens was already in a position to support the Ionian cities when they began their revolt. The fact that the Ionian democracies were inspired by the example the Athenians had set no doubt further persuaded the Athenians to support the Ionian Revolt, especially since the cities of Ionia were originally Athenian colonies. The Athenians and Eretrians sent a task force of 25 triremes to Asia Minor to aid the revolt. Whilst there, the Greek army surprised and outmaneuvered Artaphernes, marching to Sardis and burning the lower city. This was, however, as much as the Greeks achieved, and they were then repelled and pursued back to the coast by Persian horsemen, losing many men in the process. Despite the fact that their actions were ultimately fruitless, the Eretrians and in particular the Athenians had earned Darius's lasting enmity, and he vowed to punish both cities. The Persian naval victory at the Battle of Lade (494 BC) all but ended the Ionian Revolt, and by 493 BC, the last hold-outs were vanquished by the Persian fleet. The revolt was used as an opportunity by Darius to extend the empire's border to the islands of the eastern Aegean and the Propontis, which had not been part of the Persian dominions before. The pacification of Ionia allowed the Persians to begin planning their next moves: to extinguish the threat to the empire from Greece, and to punish Athens and Eretria. In 492 BC, after the Ionian Revolt had finally been crushed, Darius dispatched an expedition to Greece under the command of his son-in-law, Mardonius. Mardonius re-subjugated Thrace and made Macedonia fully subordinate to the Persians; it had been a vassal of the Persians since the late 6th century BC, but retained its general autonomy.
Not long after, however, his fleet was wrecked by a violent storm, which brought a premature end to the campaign. In 490 BC, following the successes of the previous campaign, Darius decided to send a maritime expedition led by Artaphernes (son of the satrap to whom Hippias had fled) and Datis, a Median admiral; Mardonius had been injured in the prior campaign and had fallen out of favor. The expedition was intended to bring the Cyclades into the Persian empire, to punish Naxos (which had resisted a Persian assault in 499 BC), and then to head to Greece to force Eretria and Athens to submit to Darius or be destroyed. After island-hopping across the Aegean, including successfully attacking Naxos, the Persian force arrived off Euboea in mid-summer. The Persians then proceeded to besiege, capture, and burn Eretria. They then headed south down the coast of Attica to complete the final objective of the campaign: to punish Athens. Prelude. The Persians sailed down the coast of Attica and landed at the bay of Marathon, about 40 kilometers (25 miles) northeast of Athens, on the advice of the exiled Athenian tyrant Hippias (who had accompanied the expedition). Under the guidance of Miltiades, the Athenian general with the greatest experience of fighting the Persians, the Athenian army marched quickly to block the two exits from the plain of Marathon and prevent the Persians from moving inland. At the same time, Athens's greatest runner, Pheidippides (or Philippides in some accounts), was sent to Sparta to request that the Spartan army march to the aid of Athens. Pheidippides arrived during the festival of Carneia, a sacrosanct period of peace, and was informed that the Spartan army could not march to war until the full moon rose; Athens could not expect reinforcement for at least ten days. The Athenians would have to hold out at Marathon for the time being, although they were reinforced by the full muster of 1,000 hoplites from the small city of Plataea, a gesture which did much to steady the nerves of the Athenians and won unending Athenian gratitude to Plataea. For approximately five days the armies confronted each other across the plain of Marathon in stalemate. The flanks of the Athenian camp were protected either by a grove of trees or by an "abatis" of stakes (depending on the exact reading). Since every day brought the arrival of the Spartans closer, the delay worked in favor of the Athenians. There were ten Athenian "strategoi" (generals) at Marathon, elected by each of the ten tribes into which the Athenians were divided; Miltiades was one of them. In addition, in overall charge was the War-Archon (polemarch), Callimachus, who had been elected by the whole citizen body. Herodotus suggests that command rotated between the "strategoi", each taking in turn a day to command the army. He further suggests that each "strategos", on his day in command, instead deferred to Miltiades. In Herodotus's account, Miltiades is keen to attack the Persians (despite knowing that the Spartans are coming to aid the Athenians), but, strangely, chooses to wait until his actual day of command to attack. This passage is undoubtedly problematic: the Athenians had little to gain by attacking before the Spartans arrived, and there is no real evidence of any such rotating generalship. There does, however, seem to have been a delay between the Athenian arrival at Marathon and the battle; Herodotus, who evidently believed that Miltiades was eager to attack, may have made a mistake while seeking to explain this delay.
As is discussed below, the reason for the delay was probably simply that neither the Athenians nor the Persians were willing to risk battle initially. This raises the question of why the battle occurred when it did. Herodotus explicitly tells us that the Greeks attacked the Persians (and the other sources confirm this), but it is not clear why they did so before the arrival of the Spartans. There are two main theories to explain this. The first theory is that the Persian cavalry left Marathon for an unspecified reason, and that the Greeks moved to take advantage of this by attacking. This theory is based on the absence of any mention of cavalry in Herodotus's account of the battle, and on an entry in the "Suda" dictionary. The entry "χωρίς ἱππέων" ("without cavalry") is explained thus: The cavalry left. When Datis surrendered and was ready for retreat, the Ionians climbed the trees and gave the Athenians the signal that the cavalry had left. And when Miltiades realized that, he attacked and thus won. From there comes the above-mentioned quote, which is used when someone breaks ranks before battle. There are many variations of this theory, but perhaps the most prevalent is that the cavalry were completing the time-consuming process of re-embarking on the ships, and were to be sent by sea to attack (undefended) Athens in the rear, whilst the rest of the Persians pinned down the Athenian army at Marathon. This theory therefore utilises Herodotus's suggestion that after Marathon, the Persian army began to re-embark, intending to sail around Cape Sounion to attack Athens directly. Thus, this re-embarkation would have occurred "before" the battle (and indeed would have triggered it). The second theory is simply that the battle occurred because the Persians finally moved to attack the Athenians. Although this theory has the Persians moving to the "strategic" offensive, it can be reconciled with the traditional account of the Athenians attacking the Persians by assuming that, seeing the Persians advancing, the Athenians took the "tactical" offensive and attacked them. Obviously, it cannot be firmly established which theory (if either) is correct. However, both theories imply that there was some kind of Persian activity on or about the fifth day which ultimately triggered the battle. It is also possible that both theories are correct: when the Persians sent the cavalry by ship to attack Athens, they simultaneously sent their infantry to attack at Marathon, triggering the Greek counterattack. Date of the battle. Herodotus gives dates for several events in the lunisolar calendar, of which each Greek city-state used a variant. Astronomical computation allows us to derive an absolute date in the proleptic Julian calendar, which is widely used by historians as a chronological frame. Philipp August Böckh concluded in 1855 that the battle took place on September 12, 490 BC in the Julian calendar, and this is the conventionally accepted date. However, this depends on when exactly the Spartans held their festival, and it is possible that the Spartan calendar was one month ahead of that of Athens; in that case the battle took place on August 12, 490 BC. Opposing forces. Athenians. Herodotus does not give a figure for the size of the Athenian army. However, Cornelius Nepos, Pausanias and Plutarch all give the figure of 9,000 Athenians and 1,000 Plataeans, while Justin suggests that there were 10,000 Athenians and 1,000 Plataeans.
These numbers are highly comparable to the number of troops Herodotus says the Athenians and Plataeans sent to the Battle of Plataea 11 years later. Pausanias noticed on the monument to the battle the names of former slaves who were freed in exchange for military services. Modern historians generally accept these numbers as reasonable. The areas ruled by Athens (Attica) had a population of 315,000 at this time, including slaves, which implies that the full Athenian army at the times of both Marathon and Plataea numbered about 3% of the population. Persians. According to Herodotus, the fleet sent by Darius consisted of 600 triremes. Herodotus does not estimate the size of the Persian army, only saying that they were a "large infantry that was well packed". Among ancient sources, the poet Simonides, another near-contemporary, says the campaign force numbered 200,000, while a later writer, the Roman Cornelius Nepos, estimates 200,000 infantry and 10,000 cavalry, of which only 100,000 fought in the battle, the rest being loaded into the fleet that was rounding Cape Sounion. Plutarch and Pausanias both independently give 300,000, as does the Suda dictionary; Plato and Lysias give 500,000, and Justinus 600,000. Modern historians have proposed wide-ranging numbers for the infantry, from 20,000 to 100,000, with a consensus of perhaps 25,000; estimates for the cavalry are in the range of 1,000. The fleet included various contingents from different parts of the Achaemenid Empire, particularly Ionians and Aeolians, although they are not mentioned as participating directly in the battle and may have remained on the ships. Regarding the ethnicities involved in the battle, Herodotus specifically mentions the presence of the Persians and the Sakae at the center of the Achaemenid line. Strategic and tactical considerations. From a strategic point of view, the Athenians had some disadvantages at Marathon. In order to face the Persians in battle, the Athenians had to summon all available hoplites; even then they were still probably outnumbered at least 2 to 1. Furthermore, raising such a large army had denuded Athens of defenders: any secondary attack in the Athenian rear would cut the army off from the city, and any direct attack on the city could not be defended against. Still further, defeat at Marathon would mean the complete defeat of Athens, since no other Athenian army existed. The Athenian strategy was therefore to keep the Persian army pinned down at Marathon, blocking both exits from the plain and thus preventing themselves from being outmaneuvered. However, these disadvantages were balanced by some advantages. The Athenians initially had no need to seek battle, since they had managed to confine the Persians to the plain of Marathon. Furthermore, time worked in their favour, as every day brought the arrival of the Spartans closer. Having everything to lose by attacking and much to gain by waiting, the Athenians remained on the defensive in the run-up to the battle. Tactically, hoplites were vulnerable to attacks by cavalry, and since the Persians had substantial numbers of cavalry, this made any offensive maneuver by the Athenians even more of a risk, and thus reinforced their defensive strategy. The Persian strategy, in contrast, was probably principally determined by tactical considerations.
The Persian infantry was evidently lightly armoured, and no match for hoplites in a head-on confrontation (as would be demonstrated at the later battles of Thermopylae and Plataea). Since the Athenians seem to have taken up a strong defensive position at Marathon, the Persian hesitance was probably a reluctance to attack the Athenians head-on. The camp of the Athenians was located on a spur of Mount Agrieliki next to the plain of Marathon; remains of its fortifications are still visible. Whatever event eventually triggered the battle, it obviously altered the strategic or tactical balance sufficiently to induce the Athenians to attack the Persians. If the first theory is correct (see above), then the absence of cavalry removed the main Athenian tactical disadvantage, and the threat of being outflanked made it imperative to attack. But if the second theory is correct, then the Athenians were merely reacting to the Persians attacking them. Since the Persian force obviously contained a high proportion of missile troops, a static defensive position would have made little sense for the Athenians; the strength of the hoplite was in the melee, and the sooner that could be brought about, the better, from the Athenian point of view. If the second theory is correct, this raises the further question of why the Persians, having hesitated for several days, then attacked. There may have been several strategic reasons for this; perhaps they were aware (or suspected) that the Athenians were expecting reinforcements. Alternatively, they may have felt the need to force some kind of victory: they could hardly remain at Marathon indefinitely. Battle. First phase: the two armies form their lines. The distance between the two armies at the point of battle had narrowed to "a distance not less than 8 stadia", or about 1,500 meters. Miltiades ordered the two tribes forming the center of the Greek formation, the Leontis tribe led by Themistocles and the Antiochis tribe led by Aristides, to be arranged to a depth of four ranks, while the rest of the tribes at their flanks were arranged in ranks of eight. Some modern commentators have suggested this was a deliberate ploy to encourage a double envelopment of the Persian centre. However, this suggests a level of training that the Greeks are thought not to have possessed; there is little evidence for any such tactical thinking in Greek battles until Leuctra in 371 BC. It is therefore possible that this arrangement was made, perhaps at the last moment, so that the Athenian line was as long as the Persian line and would not be outflanked. Second phase: the Greeks attack and the lines make contact. When the Athenian line was ready, according to one source, the simple signal to advance was given by Miltiades: "At them". Herodotus implies the Athenians ran the whole distance to the Persian lines, though this is generally thought to be impossible under the weight of hoplite armour. More likely, they marched until they reached the limit of the archers' effectiveness, the "beaten zone" (roughly 200 meters), and then broke into a run towards their enemy. Another possibility is that they ran "up to" the 200-meter mark in broken ranks, and then reformed for the march into battle from there. Herodotus suggests that this was the first time a Greek army ran into battle in this way; this was probably because it was the first time that a Greek army had faced an enemy composed primarily of missile troops. All this was evidently much to the surprise of the Persians; "...
in their minds they charged the Athenians with madness which must be fatal, seeing that they were few and yet were pressing forwards at a run, having neither cavalry nor archers". Indeed, based on their previous experience of the Greeks, the Persians might be excused for this; Herodotus tells us that the Athenians at Marathon were "first to endure looking at Median dress and men wearing it, for up until then just hearing the name of the Medes caused the Hellenes to panic". Passing through the hail of arrows launched by the Persian army, protected for the most part by their armour, the Greek line finally made contact with the enemy army. Fourth phase: the Persian wings collapse. The Athenian wings quickly routed the inferior Persian levies on the flanks, before turning inwards to surround the Persian centre, which had been more successful against the thin Greek centre. Fifth phase: the Persians are routed and retreat to their ships. The battle ended when the Persian centre broke in panic towards their ships, pursued by the Greeks. Some, unaware of the local terrain, ran towards the swamps, where unknown numbers drowned. The Athenians pursued the Persians back to their ships and managed to capture seven ships, though the majority were able to launch successfully. Herodotus recounts the story that Cynaegirus, brother of the playwright Aeschylus, who was also among the fighters, charged into the sea, grabbed one Persian trireme, and started pulling it towards shore. A member of the crew saw him, cut off his hand, and Cynaegirus died. Herodotus records that 6,400 Persian bodies were counted on the battlefield; it is unknown how many more perished in the swamps. He also reports that the Athenians lost 192 men and the Plataeans 11. Among the dead were the war archon Callimachus and the general Stesilaos. Conclusions. There are several explanations of the Greek success. Most scholars believe that the Greeks had better equipment and used superior tactics. According to Herodotus, the Greeks were better equipped: they did not use bronze upper-body armour at this time, but armour of leather or linen. The phalanx formation proved successful because the hoplites had a long tradition of hand-to-hand combat, whereas the Persian soldiers were accustomed to a very different kind of conflict. At Marathon, the Athenians thinned their centre in order to make their army equal in length to the Persian army, not as a result of tactical planning. It seems that the Persian centre tried to return, realizing that their wings had broken, and was caught in the flanks by the victorious Greek wings. Lazenby (1993) believes that the ultimate reason for the Greek success was the courage the Greeks displayed: Marathon was won because ordinary, amateur soldiers found the courage to break into a trot when the arrows began to fall, instead of grinding to a halt, and when surprisingly the enemy wings fled, not to take the easy way out and follow them, but to stop and somehow come to the aid of the hard-pressed centre. According to Vic Hurley, the Persian defeat is explained by the "complete failure ... to field a representative army", and he calls the battle the "most convincing" example of the fact that infantry-bowmen cannot defend a position at close quarters when unsupported, i.e. without fortifications or the support of cavalry and chariots, as was the common Persian tactic. Aftermath.
In the immediate aftermath of the battle, Herodotus says that the Persian fleet sailed around Cape Sounion to attack Athens directly. As discussed above, some modern historians place this attempt just before the battle. Either way, the Athenians evidently realised that their city was still under threat, and marched as quickly as possible back to Athens. The two tribes which had been in the centre of the Athenian line stayed to guard the battlefield, under the command of Aristides. The Athenians arrived in time to prevent the Persians from securing a landing, and seeing that the opportunity was lost, the Persians turned about and returned to Asia. Connected with this episode, Herodotus recounts a rumour that this manoeuvre by the Persians had been planned in conjunction with the Alcmaeonids, the prominent Athenian aristocratic family, and that a "shield-signal" had been given after the battle. Although many interpretations of this have been offered, it is impossible to tell whether it was true, and if so, what exactly the signal meant. On the next day, the Spartan army arrived at Marathon, having covered some 220 kilometers (140 miles) in only three days. The Spartans toured the battlefield at Marathon and agreed that the Athenians had won a great victory. The Athenian and Plataean dead of Marathon were buried on the battlefield in two tumuli. On the tomb of the Athenians this epigram composed by Simonides was written: Meanwhile, Darius began raising a huge new army with which he meant to completely subjugate Greece; however, in 486 BC, his Egyptian subjects revolted, indefinitely postponing any Greek expedition. Darius then died whilst preparing to march on Egypt, and the throne of Persia passed to his son Xerxes I. Xerxes crushed the Egyptian revolt, and very quickly restarted the preparations for the invasion of Greece. The epic second Persian invasion of Greece finally began in 480 BC, and the Persians met with initial success at the battles of Thermopylae and Artemisium. They were then defeated at the Battle of Salamis, after Xerxes had burnt the evacuated city of Athens to the ground, and the following year the expedition was ended by the decisive Greek victory at the Battle of Plataea. Significance. The defeat at Marathon barely touched the vast resources of the Persian empire, yet for the Greeks it was an enormously significant victory. It was the first time the Greeks had beaten the Persians, proving that the Persians were not invincible, and that resistance, rather than subjugation, was possible. The battle was a defining moment for the young Athenian democracy, showing what might be achieved through unity and self-belief; indeed, the battle effectively marks the start of a "golden age" for Athens. This was also applicable to Greece as a whole; "their victory endowed the Greeks with a faith in their destiny that was to endure for three centuries, during which Western culture was born". John Stuart Mill's famous opinion was that "the Battle of Marathon, even as an event in British history, is more important than the Battle of Hastings". According to Isaac Asimov, "if the Athenians had lost in Marathon, ... Greece might have never gone on to develop the peak of its civilization, a peak whose fruits we moderns have inherited."
It seems that the Athenian playwright Aeschylus considered his participation at Marathon to be his greatest achievement in life (rather than his plays), since on his gravestone there was the following epigram: Militarily, a major lesson for the Greeks was the potential of the hoplite phalanx. This style had developed during internecine warfare amongst the Greeks; since each city-state fought in the same way, the advantages and disadvantages of the hoplite phalanx had not been obvious. Marathon was the first time a phalanx faced more lightly armed troops, and revealed how effective the hoplites could be in battle. The phalanx formation was still vulnerable to cavalry (the cause of much caution by the Greek forces at the Battle of Plataea), but used in the right circumstances, it was now shown to be a potentially devastating weapon. Sources. The main source for the Greco-Persian Wars is the Greek historian Herodotus. Herodotus, who has been called the "Father of History", was born in 484 BC in Halicarnassus, Asia Minor (then under Persian overlordship). He wrote his "Enquiries" (Greek "Historiai"; English "(The) Histories") around 440–430 BC, trying to trace the origins of the Greco-Persian Wars, which would still have been relatively recent history (the wars finally ended in 450 BC). Herodotus's approach was entirely novel, and at least in Western society, he does seem to have invented "history" as we know it. As Holland has it: "For the first time, a chronicler set himself to trace the origins of a conflict not to a past so remote as to be utterly fabulous, nor to the whims and wishes of some god, nor to a people's claim to manifest destiny, but rather explanations he could verify personally." Some subsequent ancient historians, despite following in his footsteps, criticised Herodotus, starting with Thucydides. Nevertheless, Thucydides chose to begin his history where Herodotus left off (at the Siege of Sestos), and may therefore have felt that Herodotus's history was accurate enough not to need re-writing or correcting. Plutarch criticised Herodotus in his essay "On the Malice of Herodotus", describing Herodotus as "Philobarbaros" (barbarian-lover) for not being pro-Greek enough, which suggests that Herodotus might actually have done a reasonable job of being even-handed. A negative view of Herodotus was passed on to Renaissance Europe, though he remained well read. However, since the 19th century his reputation has been dramatically rehabilitated by archaeological finds which have repeatedly confirmed his version of events. The prevailing modern view is that Herodotus generally did a remarkable job in his "Historiai", but that some of his specific details (particularly troop numbers and dates) should be viewed with skepticism. Nevertheless, there are still some historians who believe Herodotus made up much of his story. The Sicilian historian Diodorus Siculus, writing in the 1st century BC in his "Bibliotheca Historica", also provides an account of the Greco-Persian wars, partially derived from the earlier Greek historian Ephorus. This account is fairly consistent with Herodotus's. The Greco-Persian wars are also described in less detail by a number of other ancient historians, including Plutarch and Ctesias of Cnidus, and are alluded to by other authors, such as the playwright Aeschylus. Archaeological evidence, such as the Serpent Column, also supports some of Herodotus's specific claims. Legacy. Legends associated with the battle.
The most famous legend associated with Marathon is that of the runner Pheidippides (or Philippides) bringing news to Athens of the battle, which is described below. Pheidippides' run to Sparta to bring aid has other legends associated with it. Herodotus mentions that Pheidippides was visited by the god Pan on his way to Sparta (or perhaps on his return journey). Pan asked why the Athenians did not honor him, and the awed Pheidippides promised that they would do so from then on. The god apparently felt that the promise would be kept, so he appeared in battle and at the crucial moment instilled in the Persians his own brand of fear: the mindless, frenzied fear that bore his name, "panic". After the battle, a sacred precinct was established for Pan in a grotto on the north slope of the Acropolis, and a sacrifice was offered annually. Similarly, after the victory the festival of the "Agroteras Thysia" ("sacrifice to the Agrotéra") was held at Agrae near Athens, in honor of Artemis Agrotera ("Artemis the Huntress"). This was in fulfillment of a vow made by the city before the battle to offer in sacrifice a number of goats equal to that of the Persians slain in the conflict. The number was so great that it was decided to offer 500 goats yearly until the number was filled; Xenophon notes that in his time, 90 years after the battle, goats were still offered yearly. Plutarch mentions that the Athenians saw the phantom of King Theseus, the mythical hero of Athens, leading the army in full battle gear in the charge against the Persians, and indeed he was depicted in the mural of the Stoa Poikile fighting for the Athenians, along with the twelve Olympian gods and other heroes. Pausanias also tells us that: They say too that there chanced to be present in the battle a man of rustic appearance and dress. Having slaughtered many of the foreigners with a plough he was seen no more after the engagement. When the Athenians made enquiries at the oracle, the god merely ordered them to honor Echetlaeus ("he of the Plough-tail") as a hero. Another tale from the conflict is of the dog of Marathon. Aelian relates that one hoplite brought his dog to the Athenian encampment. The dog followed his master to battle and attacked the Persians at his master's side. He also informs us that this dog is depicted in the mural of the Stoa Poikile. Marathon run. According to Herodotus, an Athenian runner named Pheidippides was sent to run from Athens to Sparta to ask for assistance before the battle. He ran a distance of over 225 kilometers (140 miles), arriving in Sparta the day after he left. Then, following the battle, the Athenian army marched the 40 kilometers (25 miles) or so back to Athens at a very high pace (considering the quantity of armour and the fatigue after the battle), in order to head off the Persian force sailing around Cape Sounion. They arrived back in the late afternoon, in time to see the Persian ships turn away from Athens, thus completing the Athenian victory. Later, in popular imagination, these two events became conflated, leading to a legendary but inaccurate version of events. This myth has Pheidippides running from Marathon to Athens after the battle to announce the Greek victory with the word "nenikēkamen!" ("we've won!"), whereupon he promptly died of exhaustion. This story first appears in Plutarch's "On the Glory of Athens" (1st century AD), which quotes Heracleides of Pontus's lost work, giving the runner's name as either Thersipus of Erchius or Eucles.
Lucian of Samosata (2nd century AD) gives the same story but names the runner Philippides (not Pheidippides). In some medieval codices of Herodotus, the name of the runner between Athens and Sparta before the battle is given as Philippides, and this name is also preferred in a few modern editions. When the idea of a modern Olympics became a reality at the end of the 19th century, the initiators and organizers were looking for a great popularizing event, recalling the ancient glory of Greece. The idea of organizing a "marathon race" came from Michel Bréal, who wanted the event to feature in the first modern Olympic Games in 1896 in Athens. This idea was heavily supported by Pierre de Coubertin, the founder of the modern Olympics, as well as by the Greeks. This would echo the legendary version of events, with the competitors running from Marathon to Athens. So popular was this event that it quickly caught on, becoming a fixture at the Olympic games, with major cities staging their own annual events. The distance eventually became fixed at 42.195 kilometers (26 miles 385 yards), though for the first years it was variable, being around 40 kilometers (25 miles) – the approximate distance from Marathon to Athens.
https://en.wikipedia.org/wiki?curid=4810
Balance of trade
Balance of trade is the difference between the monetary value of a nation's exports and imports of goods over a certain time period. Sometimes, trade in services is also included in the balance of trade, but the official IMF definition only considers goods. The balance of trade is a flow variable, measuring exports and imports over a given period of time. The notion of the balance of trade does not mean that exports and imports are "in balance" with each other. If a country exports a greater value than it imports, it has a trade surplus or positive trade balance; conversely, if a country imports a greater value than it exports, it has a trade deficit or negative trade balance. As of 2016, about 60 out of 200 countries have a trade surplus. The idea that a trade deficit is detrimental to a nation's economy, and in particular that bilateral trade deficits are bad in and of themselves, is overwhelmingly rejected by trade experts and economists. Explanation. The balance of trade forms part of the current account, which includes other transactions such as income from the net international investment position as well as international aid. If the current account is in surplus, the country's net international asset position increases correspondingly. Equally, a deficit decreases the net international asset position. The trade balance is identical to the difference between a country's output and its domestic demand (the difference between what goods a country produces and how many goods it buys from abroad; this does not include money re-spent on foreign stock, nor does it factor in the concept of importing goods to produce for the domestic market). Measuring the balance of trade can be problematic because of problems with recording and collecting data. As an illustration of this problem, when official data for all the world's countries are added up, exports exceed imports by almost 1%; it appears the world is running a positive balance of trade with itself. This cannot be true, because all transactions involve an equal credit or debit in the account of each nation. The discrepancy is widely believed to be explained by transactions intended to launder money or evade taxes, smuggling and other visibility problems. While the accuracy of developing countries' statistics might be expected to be the source of the problem, most of the discrepancy actually occurs between developed countries with trusted statistics. A number of factors can affect the balance of trade. In addition, the trade balance is likely to differ across the business cycle. In export-led growth (such as oil and early industrial goods), the balance of trade will shift towards exports during an economic expansion. However, with domestic demand-led growth (as in the United States and Australia) the trade balance will shift towards imports at the same stage in the business cycle. The monetary balance of trade is different from the physical balance of trade (which is expressed in amount of raw materials, known also as Total Material Consumption). Developed countries usually import a substantial amount of raw materials from developing countries. Typically, these imported materials are transformed into finished products and might be exported after adding value. Financial trade balance statistics conceal material flow. Most developed countries have a large physical trade deficit because they consume more raw materials than they produce. Examples. Historical example.
Many countries in early modern Europe adopted a policy of mercantilism, which theorized that a trade surplus was beneficial to a country. Mercantilist ideas also influenced how European nations regulated trade policies with their colonies, promoting the idea that natural resources and cash crops should be exported to Europe, with processed goods being exported back to the colonies in return. Ideas such as bullionism spurred the popularity of mercantilism in European governments. An early statement concerning the balance of trade appeared in "Discourse of the Common Wealth of this Realm of England", 1549: "We must always take heed that we buy no more from strangers than we sell them, for so should we impoverish ourselves and enrich them." Similarly, a systematic and coherent explanation of balance of trade was made public through Thomas Mun's 1630 "England's treasure by foreign trade, or, The balance of our foreign trade is the rule of our treasure". Since the mid-1980s, the United States has had a growing deficit in tradeable goods, especially with Asian nations (China and Japan) which now hold large sums of U.S. debt that has in part funded this consumption. The U.S. has a trade surplus with nations such as Australia. The issue of trade deficits can be complex. Trade deficits generated in tradeable goods such as manufactured goods or software may impact domestic employment to different degrees than do trade deficits in raw materials. Economies that have savings surpluses, such as Japan and Germany, typically run trade surpluses. China, a high-growth economy, has tended to run trade surpluses. A higher savings rate generally corresponds to a trade surplus. Correspondingly, the U.S. with its lower savings rate has tended to run high trade deficits, especially with Asian nations. Some have said that China pursues a mercantilist economic policy. Russia pursues a policy based on protectionism, according to which international trade is not a win–win game but a zero-sum game: surplus countries get richer at the expense of deficit countries. Views on economic impact. The notion that bilateral trade deficits are bad in and of themselves is overwhelmingly rejected by trade experts and economists. According to the IMF, trade deficits can cause a balance of payments problem, which can lead to foreign exchange shortages and hurt countries. On the other hand, Joseph Stiglitz points out that countries running surpluses exert a "negative externality" on trading partners, and pose a far greater threat to global prosperity than those in deficit. Ben Bernanke argues that "persistent imbalances within the euro zone are... unhealthy, as they lead to financial imbalances as well as to unbalanced growth. The fact that Germany is selling so much more than it is buying redirects demand from its neighbors (as well as from other countries around the world), reducing output and employment outside Germany." According to Carla Norrlöf, there are three main benefits to trade deficits for the United States. A 2018 National Bureau of Economic Research paper by economists at the International Monetary Fund and University of California, Berkeley, found in a study of 151 countries over 1963–2014 that the imposition of tariffs had little effect on the trade balance. Keynesian theory. In the last few years of his life, John Maynard Keynes was much preoccupied with the question of balance in international trade.
He was the leader of the British delegation to the United Nations Monetary and Financial Conference in 1944 that established the Bretton Woods system of international currency management. He was the principal author of a proposal – the so-called Keynes Plan – for an International Clearing Union. The two governing principles of the plan were that the problem of settling outstanding balances should be solved by "creating" additional "international money", and that debtor and creditor should be treated almost alike as disturbers of equilibrium. In the event, though, the plans were rejected, in part because "American opinion was naturally reluctant to accept the principle of equality of treatment so novel in debtor-creditor relationships". The proposed system was founded not on free trade (the liberalisation of foreign trade) but on the regulation of international trade, in order to eliminate trade imbalances: nations with a surplus would have a powerful incentive to get rid of it, and in doing so they would automatically clear other nations' deficits. Keynes proposed a global bank that would issue its own currency – the bancor – which was exchangeable with national currencies at fixed rates of exchange and would become the unit of account between nations, which means it would be used to measure a country's trade deficit or trade surplus. Every country would have an overdraft facility in its bancor account at the International Clearing Union. He pointed out that surpluses lead to weak global aggregate demand – countries running surpluses exert a "negative externality" on trading partners and pose a far greater threat to global prosperity than those in deficit. In "National Self-Sufficiency" ("The Yale Review", Vol. 22, No. 4, June 1933), he had already highlighted the problems created by free trade. His view, supported by many economists and commentators at the time, was that creditor nations may be just as responsible as debtor nations for disequilibrium in exchanges and that both should be under an obligation to bring trade back into a state of balance. Failure to do so could have serious consequences. In the words of Geoffrey Crowther, then editor of "The Economist", "If the economic relationships between nations are not, by one means or another, brought fairly close to balance, then there is no set of financial arrangements that can rescue the world from the impoverishing results of chaos." These ideas were informed by events prior to the Great Depression when – in the opinion of Keynes and others – international lending, primarily by the U.S., exceeded the capacity of sound investment and so got diverted into non-productive and speculative uses, which in turn invited default and a sudden stop to the process of lending. Influenced by Keynes, economics texts in the immediate post-war period put a significant emphasis on balance in trade. For example, the second edition of the popular introductory textbook, "An Outline of Money", devoted the last three of its ten chapters to questions of foreign exchange management and in particular the 'problem of balance'. However, in more recent years, since the end of the Bretton Woods system in 1971, with the increasing influence of monetarist schools of thought in the 1980s, and particularly in the face of large sustained trade imbalances, these concerns – and particularly concerns about the destabilising effects of large trade surpluses – have largely disappeared from mainstream economics discourse and Keynes' insights have slipped from view.
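The netting property of Keynes's clearing union, described above, is easier to see with a toy ledger. The sketch below is only an illustration: the class and method names are invented for this example, and the actual Keynes Plan rules (quotas, charges levied on both surplus and deficit positions) were considerably more elaborate.

```python
# Toy model of an International Clearing Union ledger (illustrative only).
# All names (ClearingUnion, settle, ...) are invented for this sketch.

class ClearingUnion:
    def __init__(self, members, overdraft_limit=100.0):
        # Every member starts with a zero bancor balance.
        self.balances = {m: 0.0 for m in members}
        self.overdraft_limit = overdraft_limit

    def settle(self, exporter, importer, amount_bancor):
        """Record a trade: the exporter is credited, the importer debited."""
        if self.balances[importer] - amount_bancor < -self.overdraft_limit:
            raise ValueError(f"{importer} would exceed its overdraft facility")
        self.balances[exporter] += amount_bancor
        self.balances[importer] -= amount_bancor

    def total(self):
        # Clearing property: credits and debits always sum to zero.
        return sum(self.balances.values())

union = ClearingUnion(["UK", "US", "France"])
union.settle(exporter="US", importer="UK", amount_bancor=60.0)
union.settle(exporter="UK", importer="France", amount_bancor=40.0)
print(union.balances)  # {'UK': -20.0, 'US': 60.0, 'France': -40.0}
print(union.total())   # 0.0 -- surpluses exactly offset deficits
```

Because every credit to an exporter is matched by an equal debit to an importer, members' bancor balances always sum to zero: a surplus somewhere is, by construction, a deficit somewhere else, which is the symmetry between creditors and debtors that the plan sought to exploit.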
Monetarist theory. Prior to 20th-century monetarist theory, the 19th-century economist and philosopher Frédéric Bastiat expressed the idea that trade deficits actually were a manifestation of profit, rather than a loss. He proposed as an example to suppose that he, a Frenchman, exported French wine and imported British coal, turning a profit. He supposed he was in France and sent a cask of wine which was worth 50 francs to England. The customhouse would record an export of 50 francs. If, in England, the wine sold for 70 francs (or the pound equivalent), which he then used to buy coal, which he imported into France (the customhouse would record an import of 70 francs), and the coal was found to be worth 90 francs in France, he would have made a profit of 40 francs. But the customhouse would say that the value of imports exceeded that of exports and record a trade deficit of 20 francs against the ledger of France. The current account, by contrast, would be in surplus. By "reductio ad absurdum", Bastiat argued that the national trade deficit was an indicator of a successful economy, rather than a failing one. Bastiat predicted that a successful, growing economy would result in greater trade deficits, and an unsuccessful, shrinking economy would result in lower trade deficits. This was later, in the 20th century, echoed by economist Milton Friedman. In the 1980s, Friedman, a Nobel Memorial Prize-winning economist and a proponent of monetarism, contended that some of the concerns about trade deficits are unfair criticisms made in an attempt to push macroeconomic policies favorable to exporting industries. Friedman argued that trade deficits are not necessarily important, as high exports raise the value of the currency, which in turn reduces exports, and vice versa for imports, thus naturally removing trade deficits "not due to investment". Since 1971, when the Nixon administration decided to abolish fixed exchange rates, America's accumulated current-account trade deficits had totaled $7.75 trillion as of 2010. This deficit exists because it is matched by investment coming into the United States – purely by the definition of the balance of payments, any current account deficit that exists is matched by an inflow of foreign investment. In the late 1970s and early 1980s, the U.S. had experienced high inflation, and Friedman's policy positions tended to defend the stronger dollar at that time. He stated his belief that these trade deficits were not necessarily harmful to the economy at the time, since the currency comes back to the country (country A sells to country B, country B sells to country C who buys from country A, but the trade deficit only includes A and B). However, the currency may return in one form or another, including the possible trade-off of foreign control of assets. In his view, the "worst-case scenario" of the currency never returning to the country of origin was actually the best possible outcome: the country would have purchased its goods by exchanging them for pieces of cheaply made paper. As Friedman put it, this would be the same result as if the exporting country burned the dollars it earned, never returning them to market circulation. This position is a more refined version of the theorem first discovered by David Hume. Hume argued that England could not permanently gain from exports, because hoarding gold (i.e., currency) would make gold more plentiful in England; therefore, the prices of English goods would rise, making them less attractive exports and making foreign goods more attractive imports.
In this way, countries' trade balances would balance out. Friedman presented his analysis of the balance of trade in "Free to Choose", widely considered his most significant popular work. Trade balance's effects upon a nation's GDP. Exports directly increase and imports directly reduce a nation's balance of trade (i.e. net exports). A trade surplus is a positive net balance of trade, and a trade deficit is a negative net balance of trade. Because the balance of trade is explicitly added in the expenditure method of calculating a nation's gross domestic product (GDP), trade surpluses are contributions to, and trade deficits are "drags" upon, their nation's GDP; however, foreign-made goods sold (e.g., at retail) contribute to total GDP.
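In symbols, the two accounting identities just described can be stated as follows; as a worked illustration, Bastiat's figures from the previous section are plugged in (treating the single cask of wine as the country's entire trade for the period):

```latex
% Net exports (the balance of trade) and the expenditure method for GDP
NX = X - M, \qquad GDP = C + I + G + NX
% Bastiat's cask of wine: exports X = 50 francs, imports M = 70 francs
NX = 50 - 70 = -20 \ \text{francs (a trade deficit, despite the trader's 40-franc profit)}
```

The sign of NX is what makes a surplus a "contribution" and a deficit a "drag" in the GDP identity, even though, as Bastiat's example shows, a recorded deficit can coincide with profitable trade.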
https://en.wikipedia.org/wiki?curid=4816
Biosphere
The biosphere, also called the ecosphere, is the worldwide sum of all ecosystems. It can also be termed the zone of life on Earth. The biosphere (which is technically a spherical shell) is virtually a closed system with regard to matter, with minimal inputs and outputs. Regarding energy, it is an open system, with photosynthesis capturing solar energy at a rate of around 100 terawatts. By the most general biophysiological definition, the biosphere is the global ecological system integrating all living beings and their relationships, including their interaction with the elements of the lithosphere, cryosphere, hydrosphere, and atmosphere. The biosphere is postulated to have evolved, beginning with a process of biopoiesis (life created naturally from matter, such as simple organic compounds) or biogenesis (life created from living matter), at least some 3.5 billion years ago. In a general sense, biospheres are any closed, self-regulating systems containing ecosystems. This includes artificial biospheres such as Biosphere 2 and BIOS-3, and potentially ones on other planets or moons. Origin and use of the term. The term "biosphere" was coined in 1875 by geologist Eduard Suess, who defined it as the place on Earth's surface where life dwells. While the concept has a geological origin, it is an indication of the effect of both Charles Darwin and Matthew F. Maury on the Earth sciences. The biosphere's ecological context comes from the 1920s (see Vladimir I. Vernadsky), preceding the 1935 introduction of the term "ecosystem" by Sir Arthur Tansley (see ecology history). Vernadsky defined ecology as the science of the biosphere. It is an interdisciplinary concept for integrating astronomy, geophysics, meteorology, biogeography, evolution, geology, geochemistry, hydrology and, generally speaking, all life and Earth sciences. Narrow definition. Geochemists define the biosphere as being the total sum of living organisms (the "biomass" or "biota" as referred to by biologists and ecologists). In this sense, the biosphere is but one of four separate components of the geochemical model, the other three being "geosphere", "hydrosphere", and "atmosphere". When these four component spheres are combined into one system, it is known as the ecosphere. This term was coined during the 1960s and encompasses both biological and physical components of the planet. The Second International Conference on Closed Life Systems defined "biospherics" as the science and technology of analogs and models of Earth's biosphere; i.e., artificial Earth-like biospheres. Others may include the creation of artificial non-Earth biospheres—for example, human-centered biospheres or a native Martian biosphere—as part of the topic of biospherics. Earth's biosphere. Overview. Currently, the total number of living cells on the Earth is estimated to be 10^30; the total number since the beginning of Earth's history, as 10^40; and the total number for the entire time of a habitable planet Earth, as 10^41. This is much larger than the total number of stars (and Earth-like planets) estimated in the observable universe, about 10^24 – a number which is more than all the grains of beach sand on planet Earth – but less than the total number of atoms estimated in the observable universe, about 10^82, and the estimated total number of stars in an inflationary universe (observed and unobserved), about 10^100. Age.
The earliest evidence for life on Earth includes biogenic graphite found in 3.7 billion-year-old metasedimentary rocks from Western Greenland and microbial mat fossils found in 3.48 billion-year-old sandstone from Western Australia. More recently, in 2015, "remains of biotic life" were found in 4.1 billion-year-old rocks in Western Australia. In 2017, putative fossilized microorganisms (or microfossils) were announced to have been discovered in hydrothermal vent precipitates in the Nuvvuagittuq Belt of Quebec, Canada, that were as old as 4.28 billion years, the oldest record of life on Earth, suggesting "an almost instantaneous emergence of life" after ocean formation 4.4 billion years ago, and not long after the formation of the Earth 4.54 billion years ago. According to biologist Stephen Blair Hedges, "If life arose relatively quickly on Earth ... then it could be common in the universe." Extent. Every part of the planet, from the polar ice caps to the equator, features life of some kind. Recent advances in microbiology have demonstrated that microbes live deep beneath the Earth's terrestrial surface and that the total mass of microbial life in so-called "uninhabitable zones" may, in biomass, exceed all animal and plant life on the surface. The actual thickness of the biosphere on Earth is difficult to measure. Birds typically fly at altitudes as high as and fish live as much as underwater in the Puerto Rico Trench. There are more extreme examples for life on the planet: Rüppell's vulture has been found at altitudes of ; bar-headed geese migrate at altitudes of at least ; yaks live at elevations as high as above sea level; mountain goats live up to . Herbivorous animals at these elevations depend on lichens, grasses, and herbs. Life forms live in every part of the Earth's biosphere, including soil, hot springs, inside rocks at least deep underground, and at least high in the atmosphere. Marine life under many forms has been found in the deepest reaches of the world ocean, while much of the deep sea remains to be explored. Under certain test conditions, microorganisms have been observed to survive the vacuum of outer space. The total amount of soil and subsurface bacterial carbon is estimated as 5 × 10^17 g. The mass of prokaryote microorganisms—which includes bacteria and archaea, but not the nucleated eukaryote microorganisms—may be as much as 0.8 trillion tons of carbon (of the total biosphere mass, estimated at between 1 and 4 trillion tons). Barophilic marine microbes have been found at more than a depth of in the Mariana Trench, the deepest spot in the Earth's oceans. In fact, single-celled life forms have been found in the deepest part of the Mariana Trench, by the Challenger Deep, at depths of . Other researchers reported related studies that microorganisms thrive inside rocks up to below the sea floor under of ocean off the coast of the northwestern United States, as well as beneath the seabed off Japan. Culturable thermophilic microbes have been extracted from cores drilled more than into the Earth's crust in Sweden, from rocks between . Temperature increases with increasing depth into the Earth's crust. The rate at which the temperature increases depends on many factors, including the type of crust (continental vs. oceanic), rock type, geographic location, etc. The greatest known temperature at which microbial life can exist is 122 °C ("Methanopyrus kandleri" Strain 116). It is likely that the limit of life in the "deep biosphere" is defined by temperature rather than absolute depth.
On 20 August 2014, scientists confirmed the existence of microorganisms living below the ice of Antarctica. Earth's biosphere is divided into several biomes, inhabited by fairly similar flora and fauna. On land, biomes are separated primarily by latitude. Terrestrial biomes lying within the Arctic and Antarctic Circles are relatively barren of plant and animal life. In contrast, most of the more populous biomes lie near the equator. Artificial biospheres. Experimental biospheres, also called closed ecological systems, have been created to study ecosystems and the potential for supporting life outside the Earth. These include spacecraft and terrestrial laboratories such as Biosphere 2 and BIOS-3. Extraterrestrial biospheres. No biospheres have been detected beyond the Earth; therefore, the existence of extraterrestrial biospheres remains hypothetical. The rare Earth hypothesis suggests they should be very rare, save ones composed of microbial life only. On the other hand, Earth analogs may be quite numerous, at least in the Milky Way galaxy, given the large number of planets. Three of the planets discovered orbiting TRAPPIST-1 could possibly contain biospheres. Given limited understanding of abiogenesis, it is currently unknown what percentage of these planets actually develop biospheres. Based on observations by the Kepler Space Telescope team, it has been calculated that, provided the probability of abiogenesis is higher than 1 in 1,000, the closest alien biosphere should be within 100 light-years from the Earth. It is also possible that artificial biospheres will be created in the future, for example with the terraforming of Mars.
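The 100-light-year figure above is a back-of-envelope density argument, and the style of calculation can be sketched in a few lines. The inputs below (a local stellar density of roughly 0.004 stars per cubic light-year, and an assumed fraction of stars hosting an Earth-like planet) are illustrative assumptions for this sketch, not the Kepler team's actual parameters.

```python
import math

# Back-of-envelope estimate of the distance to the nearest biosphere.
# All inputs are illustrative assumptions, not values from the study itself.
stellar_density = 0.004   # stars per cubic light-year (local neighbourhood)
f_earthlike = 0.1         # assumed fraction of stars with an Earth-like planet
p_abiogenesis = 1e-3      # assumed probability that life arises on such a planet

# Expected biospheres within radius r: n * f * p * (4/3) * pi * r^3.
# Setting this expectation to 1 and solving for r:
biosphere_density = stellar_density * f_earthlike * p_abiogenesis
r = (3.0 / (4.0 * math.pi * biosphere_density)) ** (1.0 / 3.0)
print(f"Nearest biosphere within roughly {r:.0f} light-years")
# ~84 light-years with these inputs, consistent with the ~100 ly claim.
```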
https://en.wikipedia.org/wiki?curid=4817
Biological membrane
A biological membrane or biomembrane is a selectively permeable membrane that separates the interior of a cell from the external environment or creates intracellular compartments by serving as a boundary between one part of the cell and another. Biological membranes, in the form of eukaryotic cell membranes, consist of a phospholipid bilayer with embedded, integral and peripheral proteins used in communication and transportation of chemicals and ions. The bulk of lipids in a cell membrane provides a fluid matrix for proteins to rotate and laterally diffuse for physiological functioning. Proteins are adapted to the high-fluidity environment of the lipid bilayer by the presence of an annular lipid shell, consisting of lipid molecules bound tightly to the surface of integral membrane proteins. Cell membranes are different from the isolating tissues formed by layers of cells, such as mucous membranes, basement membranes, and serous membranes. Composition. Asymmetry. The lipid bilayer consists of two layers: an outer leaflet and an inner leaflet. The components of bilayers are distributed unequally between the two surfaces to create asymmetry between the outer and inner surfaces. This asymmetric organization is important for cell functions such as cell signaling. The asymmetry of the biological membrane reflects the different functions of the two leaflets of the membrane. As seen in the fluid membrane model of the phospholipid bilayer, the outer leaflet and inner leaflet of the membrane are asymmetrical in their composition. Certain proteins and lipids rest only on one surface of the membrane and not the other. Using selective flippases is not the only way to produce asymmetry in lipid bilayers, however. In particular, a different mechanism operates for glycolipids—the lipids that show the most striking and consistent asymmetric distribution in animal cells. Lipids. The biological membrane is made up of lipids with hydrophobic tails and hydrophilic heads. The hydrophobic tails are hydrocarbon tails whose length and saturation are important in characterizing the cell. Lipid rafts occur when lipid species and proteins aggregate in domains in the membrane. These help organize membrane components into localized areas that are involved in specific processes, such as signal transduction. Red blood cells, or erythrocytes, have a unique lipid composition. The bilayer of red blood cells is composed of cholesterol and phospholipids in equal proportions by weight. The erythrocyte membrane plays a crucial role in blood clotting. The bilayer of red blood cells contains phosphatidylserine, which usually sits on the cytoplasmic side of the membrane but is flipped to the outer leaflet to be used during blood clotting. Proteins. Phospholipid bilayers contain different proteins. These membrane proteins have various functions and characteristics and catalyze different chemical reactions. Integral proteins span the membranes with different domains on either side. Integral proteins hold a strong association with the lipid bilayer and cannot easily become detached. They will dissociate only with chemical treatment that breaks the membrane. Peripheral proteins are unlike integral proteins in that they hold weak interactions with the surface of the bilayer and can easily become dissociated from the membrane. Peripheral proteins are located on only one face of a membrane and create membrane asymmetry. Oligosaccharides. Oligosaccharides are sugar-containing polymers.
In the membrane, they can be covalently bound to lipids to form glycolipids or covalently bound to proteins to form glycoproteins. Membranes contain sugar-containing lipid molecules known as glycolipids. In the bilayer, the sugar groups of glycolipids are exposed at the cell surface, where they can form hydrogen bonds. Glycolipids provide the most extreme example of asymmetry in the lipid bilayer. Glycolipids perform a vast number of functions in the biological membrane that are mainly communicative, including cell recognition and cell–cell adhesion. Glycoproteins are integral proteins. They play an important role in the immune response and protection. Formation. The phospholipid bilayer is formed due to the aggregation of membrane lipids in aqueous solutions. Aggregation is caused by the hydrophobic effect, where hydrophobic ends come into contact with each other and are sequestered away from water. This arrangement maximises hydrogen bonding between hydrophilic heads and water while minimising unfavorable contact between hydrophobic tails and water. The increase in available hydrogen bonding increases the entropy of the system, creating a spontaneous process. Function. Biological molecules are amphiphilic or amphipathic, i.e. they are simultaneously hydrophobic and hydrophilic. The phospholipid bilayer contains charged hydrophilic headgroups, which interact with polar water. The layers also contain hydrophobic tails, which meet with the hydrophobic tails of the complementary layer. The hydrophobic tails are usually fatty acids that differ in length. The interactions of lipids, especially the hydrophobic tails, determine the lipid bilayer's physical properties, such as fluidity. Membranes in cells typically define enclosed spaces or compartments in which cells may maintain a chemical or biochemical environment that differs from the outside. For example, the membrane around peroxisomes shields the rest of the cell from peroxides, chemicals that can be toxic to the cell, and the cell membrane separates a cell from its surrounding medium. Peroxisomes are one form of vacuole found in the cell that contain by-products of chemical reactions within the cell. Most organelles are defined by such membranes, and are called membrane-bound organelles. Selective permeability. Probably the most important feature of a biomembrane is that it is a selectively permeable structure. This means that the size, charge, and other chemical properties of the atoms and molecules attempting to cross it will determine whether they succeed in doing so. Selective permeability is essential for effective separation of a cell or organelle from its surroundings. Biological membranes also have certain mechanical or elastic properties that allow them to change shape and move as required. Generally, small hydrophobic molecules can readily cross phospholipid bilayers by simple diffusion. Particles that are required for cellular function but are unable to diffuse freely across a membrane enter through a membrane transport protein or are taken in by means of endocytosis, where the membrane allows for a vacuole to join onto it and push its contents into the cell. Many types of specialized plasma membranes can separate the cell from its external environment: apical, basolateral, presynaptic and postsynaptic ones, membranes of flagella, cilia, microvilli, filopodia and lamellipodia, the sarcolemma of muscle cells, as well as the specialized myelin and dendritic spine membranes of neurons.
Plasma membranes can also form different types of "supramembrane" structures such as caveolae, postsynaptic density, podosome, invadopodium, desmosome, hemidesmosome, focal adhesion, and cell junctions. These types of membranes differ in lipid and protein composition. Distinct types of membranes also create intracellular organelles: endosome; smooth and rough endoplasmic reticulum; sarcoplasmic reticulum; Golgi apparatus; lysosome; mitochondrion (inner and outer membranes); nucleus (inner and outer membranes); peroxisome; vacuole; cytoplasmic granules; cell vesicles (phagosome, autophagosome, clathrin-coated vesicles, COPI-coated and COPII-coated vesicles) and secretory vesicles (including synaptosome, acrosomes, melanosomes, and chromaffin granules). Different types of biological membranes have diverse lipid and protein compositions. The content of membranes defines their physical and biological properties. Some components of membranes play a key role in medicine, such as the efflux pumps that pump drugs out of a cell. Fluidity. The hydrophobic core of the phospholipid bilayer is constantly in motion because of rotations around the bonds of lipid tails. Hydrophobic tails of a bilayer bend and lock together. However, because of hydrogen bonding with water, the hydrophilic head groups exhibit less movement as their rotation and mobility are constrained. This results in increasing viscosity of the lipid bilayer closer to the hydrophilic heads. Below its transition temperature, a lipid bilayer loses fluidity: the highly mobile lipids exhibit less movement, becoming a gel-like solid. The transition temperature depends on such components of the lipid bilayer as the hydrocarbon chain length and the saturation of its fatty acids. The temperature dependence of fluidity constitutes an important physiological attribute for bacteria and cold-blooded organisms. These organisms maintain a constant fluidity by modifying membrane lipid fatty acid composition in accordance with differing temperatures. In animal cells, membrane fluidity is modulated by the inclusion of the sterol cholesterol. This molecule is present in especially large amounts in the plasma membrane, where it constitutes approximately 20% of the lipids in the membrane by weight. Because cholesterol molecules are short and rigid, they fill the spaces between neighboring phospholipid molecules left by the kinks in their unsaturated hydrocarbon tails. In this way, cholesterol tends to stiffen the bilayer, making it more rigid and less permeable. For all cells, membrane fluidity is important for many reasons. It enables membrane proteins to diffuse rapidly in the plane of the bilayer and to interact with one another, as is crucial, for example, in cell signaling. It permits membrane lipids and proteins to diffuse from sites where they are inserted into the bilayer after their synthesis to other regions of the cell. It allows membranes to fuse with one another and mix their molecules, and it ensures that membrane molecules are distributed evenly between daughter cells when a cell divides. If biological membranes were not fluid, it is hard to imagine how cells could live, grow, and reproduce. The fluidity property is at the center of the Helfrich model, which allows for calculating the energy cost of an elastic deformation to the membrane.
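The article does not quote the formula itself, but the Helfrich energy as usually written in the literature is a surface integral over the membrane's curvatures (here κ is the bending rigidity, c₀ the spontaneous curvature, H and K the mean and Gaussian curvatures, and κ̄ the Gaussian-curvature modulus):

```latex
% Standard Helfrich curvature energy of a membrane surface A
F = \int_{A} \left[ \frac{\kappa}{2}\,\bigl(2H - c_{0}\bigr)^{2} + \bar{\kappa}\,K \right] \mathrm{d}A
```

The quadratic term penalizes bending away from the membrane's preferred curvature, which is why fluid membranes resist deformation elastically even though their lipids diffuse freely in the plane.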
https://en.wikipedia.org/wiki?curid=4819
Balfour Declaration of 1926
The Balfour Declaration of 1926 was issued by the 1926 Imperial Conference of British Empire leaders in London. It was named after Arthur Balfour, who was Lord President of the Council. It declared the United Kingdom and the Dominions to be "autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations". The Inter-Imperial Relations Committee, chaired by Balfour, drew up the document preparatory to its unanimous approval by the imperial prime ministers on 15 November 1926. It was first proposed by South African Prime Minister J. B. M. Hertzog and Canadian Prime Minister William Lyon Mackenzie King. The declaration accepted the growing political and diplomatic independence of the Dominions in the years after World War I. It also recommended that the governors-general, the representatives of the King in each dominion, should no longer also serve automatically as the representative of the British government in diplomatic relations between the countries. In the following years, high commissioners were gradually appointed, whose duties were soon recognised to be virtually identical to those of an ambassador. The first such British high commissioner was appointed to Canada in 1928. The conclusions of the Imperial Conference of 1926 were re-stated by the 1930 conference and incorporated in the Statute of Westminster of December 1931. In the statute, the British Parliament provided that it would not enact a law which applied to a Dominion as part of the law of that Dominion, unless the law expressly stated that the Dominion government had requested and consented to the enactment of that law.
https://en.wikipedia.org/wiki?curid=4820
Balfour Declaration
The Balfour Declaration was a public statement issued by the British Government in 1917 during the First World War announcing its support for the establishment of a "national home for the Jewish people" in Palestine, then an Ottoman region with a small minority Jewish population. The declaration was contained in a letter dated 2 November 1917 from Arthur Balfour, the British foreign secretary, to Lord Rothschild, a leader of the British Jewish community, for transmission to the Zionist Federation of Great Britain and Ireland. The text of the declaration was published in the press on 9 November 1917. Following Britain's declaration of war on the Ottoman Empire in November 1914, it began to consider the future of Palestine. Within two months a memorandum was circulated to the War Cabinet by a Zionist member, Herbert Samuel, proposing support for Zionist ambitions in order to enlist the support of Jews in the wider war. A committee was established in April 1915 by British prime minister H. H. Asquith to determine British policy towards the Ottoman Empire, including Palestine. Asquith, who had favoured post-war reform of the Ottoman Empire, resigned in December 1916; his replacement David Lloyd George favoured partition of the Empire. The first negotiations between the British and the Zionists took place at a conference on 7 February 1917 that included Sir Mark Sykes and the Zionist leadership. Subsequent discussions led to Balfour's request, on 19 June, that Rothschild and Chaim Weizmann draft a public declaration. Further drafts were discussed by the British Cabinet during September and October, with input from Zionist and anti-Zionist Jews but with no representation from the local population in Palestine. By late 1917, the wider war had reached a stalemate, with two of Britain's allies not fully engaged: the United States had yet to suffer a casualty, and the Russians were in the midst of a revolution. A stalemate in southern Palestine was broken by the Battle of Beersheba on 31 October 1917. The release of the final declaration was authorised on 31 October; the preceding Cabinet discussion had referenced perceived propaganda benefits amongst the worldwide Jewish community for the Allied war effort. The opening words of the declaration represented the first public expression of support for Zionism by a major political power. The term "national home" had no precedent in international law, and was intentionally vague as to whether a Jewish state was contemplated. The intended boundaries of Palestine were not specified, and the British government later confirmed that the words "in Palestine" meant that the Jewish national home was not intended to cover all of Palestine. The second half of the declaration was added to satisfy opponents of the policy, who had claimed that it would otherwise prejudice the position of the local population of Palestine and encourage antisemitism worldwide by "stamping the Jews as strangers in their native lands". The declaration called for safeguarding the civil and religious rights for the Palestinian Arabs, who composed the vast majority of the local population, and also the rights and political status of the Jewish communities in countries outside of Palestine. The British government acknowledged in 1939 that the local population's wishes and interests should have been taken into account, and recognised in 2017 that the declaration should have called for the protection of the Palestinian Arabs' political rights.
The declaration greatly increased popular support for Zionism within Jewish communities worldwide, and became a core component of the British Mandate for Palestine, the founding document of Mandatory Palestine. It indirectly led to the emergence of the State of Israel and is considered a principal cause of the ongoing Israeli–Palestinian conflict, often described as the most intractable in the world. Controversy remains over a number of areas, such as whether the declaration contradicted earlier promises the British made to the Sharif of Mecca in the McMahon–Hussein correspondence. Background. Early British support. Early British political support for an increased Jewish presence in the region of Palestine was based upon geopolitical calculations. This support began in the early 1840s and was led by Lord Palmerston, following the occupation of Syria and Palestine by separatist Ottoman governor Muhammad Ali of Egypt. French influence had grown in Palestine and the wider Middle East, and its role as protector of the Catholic communities began to grow, just as Russian influence had grown as protector of the Eastern Orthodox in the same regions. This left Britain without a sphere of influence, and thus a need to find or create their own regional "protégés". These political considerations were supported by a sympathetic evangelical Christian sentiment towards the "restoration of the Jews" to Palestine among elements of the mid-19th-century British political elite – most notably Lord Shaftesbury. The British Foreign Office actively encouraged Jewish emigration to Palestine, exemplified by Charles Henry Churchill's 1841–1842 exhortations to Moses Montefiore, the leader of the British Jewish community. Such efforts were premature, and did not succeed; only 24,000 Jews were living in Palestine on the eve of the emergence of Zionism within the world's Jewish communities in the last two decades of the 19th century. With the geopolitical shakeup occasioned by the outbreak of the First World War, the earlier calculations, which had lapsed for some time, led to a renewal of strategic assessments and political bargaining over the Middle and Far East. British anti-Semitism. Although other factors played their part, Jonathan Schneer says that stereotypical thinking by British officials about Jews also played a role in the decision to issue the Declaration. Robert Cecil, Hugh O'Beirne and Sir Mark Sykes all held an unrealistic view of "world Jewry", with Cecil writing "I do not think it is possible to exaggerate the international power of the Jews." Zionist representatives saw advantage in encouraging such views. James Renton concurs, writing that the British foreign policy elite, including Prime Minister David Lloyd George and Foreign Secretary A.J. Balfour, believed that Jews possessed real and significant power that could be of use to them in the war. Early Zionism. Zionism arose in the late 19th century in reaction to anti-Semitic and exclusionary nationalist movements in Europe. Romantic nationalism in Central and Eastern Europe had helped to set off the Haskalah, or "Jewish Enlightenment", creating a split in the Jewish community between those who saw Judaism as their religion and those who saw it as their ethnicity or nation.
The 1881–1884 anti-Jewish pogroms in the Russian Empire encouraged the growth of the latter identity, resulting in the formation of the Hovevei Zion pioneer organizations, the publication of Leon Pinsker's "Autoemancipation", and the first major wave of Jewish immigration to Palestine – retrospectively named the "First Aliyah". In 1896, Theodor Herzl, a Jewish journalist living in Austria-Hungary, published the foundational text of political Zionism, "Der Judenstaat" ("The Jews' State" or "The State of the Jews"), in which he asserted that the only solution to the "Jewish Question" in Europe, including growing anti-Semitism, was the establishment of a state for the Jews. A year later, Herzl founded the Zionist Organization, which at its first congress called for the establishment of "a home for the Jewish people in Palestine secured under public law". Proposed measures to attain that goal included the promotion of Jewish settlement there, the organisation of Jews in the diaspora, the strengthening of Jewish feeling and consciousness, and preparatory steps to attain necessary governmental grants. Herzl died in 1904, 44 years before the establishment of the State of Israel, the Jewish state that he proposed, without having gained the political standing required to carry out his agenda. Zionist leader Chaim Weizmann, later President of the World Zionist Organisation and first President of Israel, moved from Switzerland to the UK in 1904 and met Arthur Balfour – who had just launched his 1905–1906 election campaign after resigning as Prime Minister – in a session arranged by Charles Dreyfus, his Jewish constituency representative. Earlier that year, Balfour had successfully driven the Aliens Act through Parliament with impassioned speeches regarding the need to restrict the wave of immigration into Britain from Jews fleeing the Russian Empire. During this meeting, he asked what Weizmann's objections had been to the 1903 Uganda Scheme that Herzl had supported to provide a portion of British East Africa to the Jewish people as a homeland. The scheme, which had been proposed to Herzl by Joseph Chamberlain, Colonial Secretary in Balfour's Cabinet, following his trip to East Africa earlier in the year, had been subsequently voted down following Herzl's death by the Seventh Zionist Congress in 1905 after two years of heated debate in the Zionist Organization. Weizmann responded that he believed the English are to London as the Jews are to Jerusalem. In January 1914 Weizmann first met Baron Edmond de Rothschild, a member of the French branch of the Rothschild family and a leading proponent of the Zionist movement, in relation to a project to build a Hebrew university in Jerusalem. The Baron was not part of the World Zionist Organization, but had funded the Jewish agricultural colonies of the First Aliyah and transferred them to the Jewish Colonization Association in 1899. This connection was to bear fruit later that year when the Baron's son, James de Rothschild, requested a meeting with Weizmann on 25 November 1914, to enlist him in influencing those deemed to be receptive within the British government to the Zionist agenda in Palestine. Through James's wife Dorothy, Weizmann was to meet Rózsika Rothschild, who introduced him to the English branch of the family – in particular her husband Charles and his older brother Walter, a zoologist and former Member of Parliament (MP).
Their father, Nathan Rothschild, 1st Baron Rothschild, head of the English branch of the family, had a guarded attitude towards Zionism, but he died in March 1915 and his title was inherited by Walter. Prior to the declaration, about 8,000 of Britain's 300,000 Jews belonged to a Zionist organisation. Globally, as of 1913 – the latest known date prior to the declaration – the equivalent figure was approximately 1%. Ottoman Palestine. The year 1916 marked four centuries since Palestine had become part of the Ottoman Empire, also known as the Turkish Empire. For most of this period, the Jewish population represented a small minority, approximately 3% of the total, with Muslims representing the largest segment of the population, and Christians the second. The Ottoman government in Constantinople began to apply restrictions on Jewish immigration to Palestine in late 1882, in response to the start of the First Aliyah earlier that year. Although this immigration was creating a certain amount of tension with the local population, mainly among the merchant and notable classes, in 1901 the Sublime Porte (the Ottoman central government) gave Jews the same rights as Arabs to buy land in Palestine and the percentage of Jews in the population rose to 7% by 1914. At the same time, with growing distrust of the Young Turks (Turkish nationalists who had taken control of the Empire in 1908) and the Second Aliyah, Arab nationalism and Palestinian nationalism were on the rise; and in Palestine anti-Zionism was a characteristic that unified these forces. Historians cannot say whether these strengthening forces would still have ultimately resulted in conflict in the absence of the Balfour Declaration. First World War. 1914–16: Initial Zionist–British Government discussions. In July 1914 war broke out in Europe between the Triple Entente (Britain, France, and the Russian Empire) and the Central Powers (Germany, Austria-Hungary, and, later that year, the Ottoman Empire). The British Cabinet first discussed Palestine at a meeting on 9 November 1914, four days after Britain's declaration of war on the Ottoman Empire, of which the Mutasarrifate of Jerusalem – often referred to as Palestine – was a component. At the meeting David Lloyd George, then Chancellor of the Exchequer, "referred to the ultimate destiny of Palestine". The Chancellor, whose law firm Lloyd George, Roberts and Co had been engaged a decade before by the Zionist Federation of Great Britain and Ireland to work on the Uganda Scheme, was to become Prime Minister by the time of the declaration, and was ultimately responsible for it. Weizmann's political efforts picked up speed, and on 10 December 1914 he met with Herbert Samuel, a British Cabinet member and a secular Jew who had studied Zionism; Samuel believed Weizmann's demands were too modest. Two days later, Weizmann met Balfour again, for the first time since their initial meeting in 1905; Balfour had been out of government ever since his electoral defeat in 1906, but remained a senior member of the Conservative Party in their role as Official Opposition. A month later, Samuel circulated a memorandum entitled "The Future of Palestine" to his Cabinet colleagues. The memorandum stated: "I am assured that the solution of the problem of Palestine which would be much the most welcome to the leaders and supporters of the Zionist movement throughout the world would be the annexation of the country to the British Empire".
Samuel discussed a copy of his memorandum with Nathan Rothschild in February 1915, a month before the latter's death. It was the first time in an official record that enlisting the support of Jews as a war measure had been proposed. Many further discussions followed, including the initial meetings in 1915–16 between Lloyd George, who had been appointed Minister of Munitions in May 1915, and Weizmann, who was appointed as a scientific advisor to the ministry in September 1915. Seventeen years later, in his "War Memoirs", Lloyd George described these meetings as being the "fount and origin" of the declaration; historians have rejected this claim. 1915–16: Prior British commitments over Palestine. In late 1915 the British High Commissioner to Egypt, Henry McMahon, exchanged ten letters with Hussein bin Ali, Sharif of Mecca, in which he promised Hussein to recognize Arab independence "in the limits and boundaries proposed by the Sherif of Mecca" in return for Hussein launching a revolt against the Ottoman Empire. The pledge excluded "portions of Syria" lying to the west of "the districts of Damascus, Homs, Hama and Aleppo". In the decades after the war, the extent of this coastal exclusion was hotly disputed since Palestine lay to the southwest of Damascus and was not explicitly mentioned. The Arab Revolt was launched on 5 June 1916, on the basis of the "quid pro quo" agreement in the correspondence. However, less than three weeks earlier the governments of the United Kingdom, France, and Russia secretly concluded the Sykes–Picot Agreement, which Balfour described later as a "wholly new method" for dividing the region, after the 1915 agreement "seems to have been forgotten". This Anglo-French treaty was negotiated in late 1915 and early 1916 between Sir Mark Sykes and François Georges-Picot, with the primary arrangements being set out in draft form in a joint memorandum on 5 January 1916. Sykes was a British Conservative MP who had risen to a position of significant influence on Britain's Middle East policy, beginning with his seat on the 1915 De Bunsen Committee and his initiative to create the Arab Bureau. Picot was a French diplomat and former consul-general in Beirut. Their agreement defined the proposed spheres of influence and control in Western Asia should the Triple Entente succeed in defeating the Ottoman Empire during World War I, dividing many Arab territories into British- and French-administered areas. In Palestine, internationalisation was proposed, with the form of administration to be confirmed after consultation with both Russia and Hussein; the January draft noted Christian and Muslim interests, and that "members of the Jewish community throughout the world have a conscientious and sentimental interest in the future of the country." Prior to this point, no active negotiations with Zionists had taken place, but Sykes had been aware of Zionism, was in contact with Moses Gaster – a former President of the English Zionist Federation – and may have seen Samuel's 1915 memorandum.
On 3 March, while Sykes and Picot were still in Petrograd, Lucien Wolf (secretary of the Foreign Conjoint Committee, set up by Jewish organizations to further the interests of foreign Jews) submitted to the Foreign Office the draft of an assurance (formula) that could be issued by the allies in support of Jewish aspirations: In the event of Palestine coming within the spheres of influence of Great Britain or France at the close of the war, the governments of those powers will not fail to take account of the historic interest that country possesses for the Jewish community. The Jewish population will be secured in the enjoyment of civil and religious liberty, equal political rights with the rest of the population, reasonable facilities for immigration and colonisation, and such municipal privileges in the towns and colonies inhabited by them as may be shown to be necessary. On 11 March, telegrams were sent in Grey's name to Britain's Russian and French ambassadors for transmission to Russian and French authorities, including the formula, as well as: The scheme might be made far more attractive to the majority of Jews if it held out to them the prospect that when in course of time the Jewish colonists in Palestine grow strong enough to cope with the Arab population they may be allowed to take the management of the internal affairs of Palestine (with the exception of Jerusalem and the holy places) into their own hands. Sykes, having seen the telegram, had discussions with Picot and proposed (making reference to Samuel's memorandum) the creation of an Arab Sultanate under French and British protection, some means of administering the holy places along with the establishment of a company to purchase land for Jewish colonists, who would then become citizens with equal rights to Arabs. Shortly after returning from Petrograd, Sykes briefed Samuel, who then briefed a meeting of Gaster, Weizmann and Sokolow. Gaster recorded in his diary on 16 April 1916: "We are offered French-English condominium in Palest[ine]. Arab Prince to conciliate Arab sentiment and as part of the Constitution a Charter to Zionists for which England would stand guarantee and which would stand by us in every case of friction ... It practically comes to a complete realisation of our Zionist programme. However, we insisted on: national character of Charter, freedom of immigration and internal autonomy, and at the same time full rights of citizenship to [illegible] and Jews in Palestine." In Sykes's mind, the agreement which bore his name was outdated even before it was signed – in March 1916, he wrote in a private letter: "to my mind the Zionists are now the key of the situation". In the event, neither the French nor the Russians were enthusiastic about the proposed formulation and eventually on 4 July, Wolf was informed that "the present moment is inopportune for making any announcement." These wartime initiatives, inclusive of the declaration, are frequently considered together by historians because of the potential, real or imagined, for incompatibility between them, particularly in regard to the disposition of Palestine. In the words of Professor Albert Hourani, founder of the Middle East Centre at St Antony's College, Oxford: "The argument about the interpretation of these agreements is one which is impossible to end, because they were intended to bear more than one interpretation." 1916–17: Change in British Government.
In terms of British politics, the declaration resulted from the coming into power of Lloyd George and his Cabinet, which had replaced the Cabinet led by H. H. Asquith in December 1916. Whilst both Prime Ministers were Liberals and both governments were wartime coalitions, Lloyd George and Balfour, appointed as his Foreign Secretary, favoured a post-war partition of the Ottoman Empire as a major British war aim, whereas Asquith and his Foreign Secretary, Sir Edward Grey, had favoured its reform. Two days after taking office, Lloyd George told General Robertson, the Chief of the Imperial General Staff, that he wanted a major victory, preferably the capture of Jerusalem, to impress British public opinion, and immediately consulted his War Cabinet about a "further campaign into Palestine when El Arish had been secured." Subsequent pressure from Lloyd George, over the reservations of Robertson, resulted in the recapture of the Sinai for British-controlled Egypt, and, with the capture of El Arish in December 1916 and Rafah in January 1917, the arrival of British forces at the southern borders of the Ottoman Empire. Following two unsuccessful attempts to capture Gaza between 26 March and 19 April, a six-month stalemate in Southern Palestine began; the Sinai and Palestine Campaign would not make any progress into Palestine until 31 October 1917. 1917: British-Zionist formal negotiations. Following the change in government, Sykes was promoted into the War Cabinet Secretariat with responsibility for Middle Eastern affairs. In January 1917, despite having previously built a relationship with Moses Gaster, he began looking to meet other Zionist leaders; by the end of the month he had been introduced to Weizmann and his associate Nahum Sokolow, a journalist and executive of the World Zionist Organization who had moved to Britain at the beginning of the war. On 7 February 1917, Sykes, claiming to be acting in a private capacity, entered into substantive discussions with the Zionist leadership. The previous British correspondence with "the Arabs" was discussed at the meeting; Sokolow's notes record Sykes's description that "The Arabs professed that language must be the measure [by which control of Palestine should be determined] and [by that measure] could claim all Syria and Palestine. Still the Arabs could be managed, particularly if they received Jewish support in other matters." At this point the Zionists were still unaware of the Sykes–Picot Agreement, although they had their suspicions. One of Sykes's goals was the mobilization of Zionism to the cause of British suzerainty in Palestine, so as to have arguments to put to France in support of that objective. Late 1917: Progress of the wider war. During the period of the British War Cabinet discussions leading up to the declaration, the war had reached a period of stalemate. On the Western Front the tide would first turn in favour of the Central Powers in spring 1918, before decisively turning in favour of the Allies from July 1918 onwards. Although the United States declared war on Germany in the spring of 1917, it did not suffer its first casualties until 2 November 1917, at which point President Woodrow Wilson still hoped to avoid dispatching large contingents of troops into the war.
The Russian forces were known to be distracted by the ongoing Russian Revolution and the growing support for the Bolshevik faction, but Alexander Kerensky's Provisional Government had remained in the war; Russia only withdrew after the final stage of the revolution on 7 November 1917. Approvals. April to June: Allied discussions. Balfour met Weizmann at the Foreign Office on 22 March 1917; two days later, Weizmann described the meeting as being "the first time I had a real business talk with him". Weizmann explained at the meeting that the Zionists had a preference for a British protectorate over Palestine, as opposed to an American, French or international arrangement; Balfour agreed, but warned that "there may be difficulties with France and Italy". The French position in regard to Palestine and the wider Syria region during the lead-up to the Balfour Declaration was largely dictated by the terms of the Sykes–Picot Agreement and was complicated from 23 November 1915 by increasing French awareness of the British discussions with the Sherif of Mecca. Prior to 1917, the British had led the fighting on the southern border of the Ottoman Empire alone, given their neighbouring Egyptian colony and the French preoccupation with the fighting on the Western Front that was taking place on their own soil. Italy's participation in the war, which began following the April 1915 Treaty of London, did not include involvement in the Middle Eastern sphere until the April 1917 Agreement of Saint-Jean-de-Maurienne; at this conference, Lloyd George had raised the question of a British protectorate of Palestine and the idea "had been very coldly received" by the French and the Italians. In May and June 1917, the French and Italians sent detachments to support the British as they built up their reinforcements in preparation for a renewed attack on Palestine. In early April, Sykes and Picot were appointed to act as the chief negotiators once more, this time on a month-long mission to the Middle East for further discussions with the Sherif of Mecca and other Arab leaders. On 3 April 1917, Sykes met with Lloyd George, Lord Curzon and Maurice Hankey to receive his instructions in this regard, namely to keep the French onside while "not prejudicing the Zionist movement and the possibility of its development under British auspices, [and not] enter into any political pledges to the Arabs, and particularly none in regard to Palestine". Before travelling to the Middle East, Picot, via Sykes, invited Nahum Sokolow to Paris to educate the French government on Zionism. Sykes, who had prepared the way in correspondence with Picot, arrived a few days after Sokolow; in the meantime, Sokolow had met Picot and other French officials, and convinced the French Foreign Office to accept for study a statement of Zionist aims "in regard to facilities of colonization, communal autonomy, rights of language and establishment of a Jewish chartered company." Sykes went on ahead to Italy and had meetings with the British ambassador and British Vatican representative to prepare the way for Sokolow once again. Sokolow was granted an audience with Pope Benedict XV on 6 May 1917. Sokolow's notes of the meeting – the only meeting records known to historians – stated that the Pope expressed general sympathy and support for the Zionist project. On 21 May 1917 Angelo Sereni, president of the Committee of the Jewish Communities, presented Sokolow to Sidney Sonnino, the Italian Minister of Foreign Affairs.
He was also received by Paolo Boselli, the Italian Prime Minister. Sonnino arranged for the secretary general of the ministry to send a letter to the effect that, although he could not express himself on the merits of a program which concerned all the allies, "generally speaking" he was not opposed to the legitimate claims of the Jews. On his return journey, Sokolow met with French leaders again and secured a letter dated 4 June 1917, giving assurances of sympathy towards the Zionist cause by Jules Cambon, head of the political section of the French foreign ministry. This letter was not published, but was deposited at the British Foreign Office. Following the United States' entry into the war on 6 April, the British Foreign Secretary led the Balfour Mission to Washington, D.C., and New York, where he spent a month between mid-April and mid-May. During the trip he spent significant time discussing Zionism with Louis Brandeis, a leading Zionist and a close ally of Wilson who had been appointed as a Supreme Court Justice a year previously. June and July: Decision to prepare a declaration. By 13 June 1917, it was acknowledged by Ronald Graham, head of the Foreign Office's Middle Eastern affairs department, that the three most relevant politicians – the Prime Minister, the Foreign Secretary, and the Parliamentary Under-Secretary of State for Foreign Affairs, Lord Robert Cecil – were all in favour of Britain supporting the Zionist movement; on the same day Weizmann had written to Graham to advocate for a public declaration. Six days later, at a meeting on 19 June, Balfour asked Lord Rothschild and Weizmann to submit a formula for a declaration. Over the next few weeks, a 143-word draft was prepared by the Zionist negotiating committee, but it was considered too specific on sensitive areas by Sykes, Graham and Rothschild. Separately, a very different draft had been prepared by the Foreign Office, described in 1961 by Harold Nicolson – who had been involved in preparing the draft – as proposing a "sanctuary for Jewish victims of persecution". The Foreign Office draft was strongly opposed by the Zionists, and was discarded; no copy of the draft has been found in the Foreign Office archives. Following further discussion, a revised – and at just 46 words in length, much shorter – draft declaration was prepared and sent by Lord Rothschild to Balfour on 18 July. It was received by the Foreign Office, and the matter was brought to the Cabinet for formal consideration. September and October: American consent and War Cabinet approval. The decision to release the declaration was taken by the British War Cabinet on 31 October 1917. This followed discussion at four War Cabinet meetings (including the 31 October meeting) over the space of the previous two months. In order to aid the discussions, the War Cabinet Secretariat, led by Maurice Hankey, the Cabinet Secretary and supported by his Assistant Secretaries – primarily Sykes and his fellow Conservative MP and pro-Zionist Leo Amery – solicited outside perspectives to put before the Cabinet. These included the views of government ministers, war allies – notably from President Woodrow Wilson – and in October, formal submissions from six Zionist leaders and four non-Zionist Jews. British officials asked President Wilson for his consent on the matter on two occasions – first on 3 September, when he replied that the time was not ripe, and later on 6 October, when he agreed with the release of the declaration.
Excerpts from the minutes of these four War Cabinet meetings provide a description of the primary factors that the ministers considered. Drafting. Declassification of British government archives has allowed scholars to piece together the choreography of the drafting of the declaration; in his widely cited 1961 book, Leonard Stein published four previous drafts of the declaration. The drafting began with Weizmann's guidance to the Zionist drafting team on its objectives in a letter dated 20 June 1917, one day following his meeting with Rothschild and Balfour. He proposed that the declaration from the British government should state: "its conviction, its desire or its intention to support Zionist aims for the creation of a Jewish national home in Palestine; no reference must be made I think to the question of the Suzerain Power because that would land the British into difficulties with the French; it must be a Zionist declaration." A month after the receipt of the much-reduced 18 July draft from Rothschild, Balfour proposed a number of mainly technical amendments. The two subsequent drafts included much more substantial amendments: the first in a late August draft by Lord Milner – one of the original five members of Lloyd George's War Cabinet as a minister without portfolio – which reduced the geographic scope from all of Palestine to "in Palestine", and the second from Milner and Amery in early October, which added the two "safeguard clauses". Subsequent authors have debated who the "primary author" really was. In his posthumously published 1981 book "The Anglo-American Establishment", Georgetown University history professor Carroll Quigley explained his view that Lord Milner was the primary author of the declaration, and more recently, William D. Rubinstein, Professor of Modern History at Aberystwyth University, Wales, proposed Amery instead. Huneidi wrote that Ormsby-Gore, in a report he prepared for Shuckburgh, claimed authorship, together with Amery, of the final draft form. Key issues. The agreed version of the declaration, a single sentence of just 67 words, was sent on 2 November 1917 in a short letter from Balfour to Walter Rothschild, for transmission to the Zionist Federation of Great Britain and Ireland. The declaration contained four clauses, of which the first two promised to support "the establishment in Palestine of a national home for the Jewish people", followed by two "safeguard clauses" with respect to "the civil and religious rights of existing non-Jewish communities in Palestine", and "the rights and political status enjoyed by Jews in any other country". The "national home for the Jewish people" vs. Jewish state. The term "national home" was intentionally ambiguous, having no legal value or precedent in international law, such that its meaning was unclear when compared to other terms such as "state". The term was intentionally used instead of "state" because of opposition to the Zionist program within the British Cabinet. According to historian Norman Rose, the chief architects of the declaration contemplated that a Jewish State would emerge in time, while the Palestine Royal Commission concluded that the wording was "the outcome of a compromise between those Ministers who contemplated the ultimate establishment of a Jewish State and those who did not." Interpretation of the wording has been sought in the correspondence leading to the final version of the declaration.
An official report to the War Cabinet sent by Sykes on 22 September said that the Zionists did "not" want "to set up a Jewish Republic or any other form of state in Palestine or in any part of Palestine" but rather preferred some form of protectorate as provided in the Palestine Mandate. A month later, Curzon produced a memorandum, circulated on 26 October 1917, in which he addressed two questions, the first concerning the meaning of the phrase "a National Home for the Jewish race in Palestine"; he noted that there were different opinions ranging from a fully fledged state to a merely spiritual centre for the Jews. Sections of the British press assumed that a Jewish state was intended even before the Declaration was finalized. In the United States the press began using the terms "Jewish National Home", "Jewish State", "Jewish republic" and "Jewish Commonwealth" interchangeably. Treaty expert David Hunter Miller, who was at the conference and subsequently compiled a 22-volume compendium of documents, provides a report of the Intelligence Section of the American Delegation to the Paris Peace Conference of 1919 which recommended that "there be established a separate state in Palestine," and that "it will be the policy of the League of Nations to recognize Palestine as a Jewish state, as soon as it is a Jewish state in fact." The report further advised that an independent Palestinian state under a British League of Nations mandate be created. Jewish settlement would be allowed and encouraged in this state, and the state's holy sites would be under the control of the League of Nations. Indeed, the Inquiry spoke positively about the possibility of a Jewish state eventually being created in Palestine if the necessary demographics for this were to exist. Historian Matthew Jacobs later wrote that the US approach was hampered by the "general absence of specialist knowledge about the region" and that "like much of the Inquiry's work on the Middle East, the reports on Palestine were deeply flawed" and "presupposed a particular outcome of the conflict". He quotes Miller, who wrote that one report on the history and impact of Zionism was "absolutely inadequate from any standpoint and must be regarded as nothing more than material for a future report". On 2 December 1917, Lord Robert Cecil assured an audience that the government fully intended that "Judea [was] for the Jews." Yair Auron opines that Cecil, then a deputy Foreign Secretary representing the British Government at a celebratory gathering of the English Zionist Federation, "possibly went beyond his official brief" in saying (he cites Stein) "Our wish is that Arabian countries shall be for the Arabs, Armenia for the Armenians and Judaea for the Jews". The following October, Neville Chamberlain, while chairing a Zionist meeting, discussed a "new Jewish State." At the time, Chamberlain was a Member of Parliament for Ladywood, Birmingham; recalling the event in 1939, just after Chamberlain had approved the 1939 White Paper, the Jewish Telegraphic Agency noted that the Prime Minister had "experienced a pronounced change of mind in the 21 years intervening". A year later, on the Declaration's second anniversary, General Jan Smuts said that Britain "would redeem her pledge ... and a great Jewish state would ultimately rise." In a similar vein, Churchill made a comparable statement a few months later. At the 22 June 1921 meeting of the Imperial Cabinet, Churchill was asked by Arthur Meighen, the Canadian Prime Minister, about the meaning of the national home.
Churchill said "If in the course of many years they become a majority in the country, they naturally would take it over ... pro rata with the Arab. We made an equal pledge that we would not turn the Arab off his land or invade his political and social rights". Responding to Curzon in January 1919, Balfour wrote "Weizmann has never put forward a claim for the Jewish Government of Palestine. Such a claim in my opinion is clearly inadmissible and personally I do not think we should go further than the original declaration which I made to Lord Rothschild". In February 1919, France issued a statement that it would not oppose putting Palestine under British trusteeship and the formation of a Jewish State. Friedman further notes that France's attitude went on to change; Yehuda Blum, while discussing France's "unfriendly attitude towards the Jewish national movement", notes the content of a report made by Robert Vansittart (a leading member of the British delegation to the Paris Peace Conference) to Curzon in November 1920 which said: Greece's Foreign Minister told the editor of the Salonica Jewish organ Pro-Israel that "the establishment of a Jewish State meets in Greece with full and sincere sympathy ... A Jewish Palestine would become an ally of Greece." In Switzerland, a number of noted historians including professors Tobler, Forel-Yvorne, and Rogaz, supported the idea of establishing a Jewish state, with one referring to it as "a sacred right of the Jews." While in Germany, officials and most of the press took the Declaration to mean a British sponsored state for the Jews. The British government, including Churchill, made it clear that the Declaration did not intend for the whole of Palestine to be converted into a Jewish National Home, "but that such a Home should be founded in Palestine." Emir Faisal, King of Syria and Iraq, made a formal written agreement with Zionist leader Chaim Weizmann, which was drafted by T. E. Lawrence, whereby they would try to establish a peaceful relationship between Arabs and Jews in Palestine. The 3 January 1919 Faisal–Weizmann Agreement was a short-lived agreement for Arab–Jewish cooperation on the development of a Jewish homeland in Palestine. Faisal did treat Palestine differently in his presentation to the Peace Conference on 6 February 1919 saying "Palestine, for its universal character, [should be] left on one side for the mutual consideration of all parties concerned". The agreement was never implemented. In a subsequent letter written in English by Lawrence for Faisal's signature, he explained: When the letter was tabled at the Shaw Commission in 1929, Rustam Haidar spoke to Faisal in Baghdad and cabled that Faisal had "no recollection that he wrote anything of the sort". In January 1930, Haidar wrote to a newspaper in Baghdad that Faisal: "finds it exceedingly strange that such a matter is attributed to him as he at no time would consider allowing any foreign nation to share in an Arab country". Awni Abd al-Hadi, Faisal's secretary, wrote in his memoirs that he was not aware that a meeting between Frankfurter and Faisal took place and that: "I believe that this letter, assuming that it is authentic, was written by Lawrence, and that Lawrence signed it in English on behalf of Faisal. I believe this letter is part of the false claims made by Chaim Weizmann and Lawrence to lead astray public opinion." 
According to Allawi, the most likely explanation for the Frankfurter letter is that a meeting took place, a letter was drafted in English by Lawrence, but that its "contents were not entirely made clear to Faisal. He then may or may not have been induced to sign it", since it ran counter to Faisal's other public and private statements at the time. A 1 March interview in "Le Matin" quoted Faisal as saying: This feeling of respect for other religions dictates my opinion about Palestine, our neighbor. That the unhappy Jews come to reside there and behave as good citizens of this country, our humanity rejoices given that they are placed under a Muslim or Christian government mandated by The League of Nations. If they want to constitute a state and claim sovereign rights in this region, I foresee very serious dangers. It is to be feared that there will be a conflict between them and the other races. Referring to his 1922 White Paper, Churchill later wrote that "there is nothing in it to prohibit the ultimate establishment of a Jewish State." And in private, many British officials agreed with the Zionists' interpretation that a state would be established when a Jewish majority was achieved. When Chaim Weizmann met with Churchill, Lloyd George and Balfour at Balfour's home in London on 21 July 1921, Lloyd George and Balfour assured Weizmann "that by the Declaration they had always meant an eventual Jewish State," according to Weizmann's minutes of that meeting. Lloyd George stated in 1937 that it was intended that Palestine would become a Jewish Commonwealth if and when Jews "had become a definite majority of the inhabitants", and Leo Amery echoed the same position in 1946. In the UNSCOP report of 1947, the issue of home versus state was subjected to scrutiny, arriving at a similar conclusion to that of Lloyd George. Scope of the national home "in Palestine". The statement that such a homeland would be found "in Palestine" rather than "of Palestine" was also deliberate. The proposed draft of the declaration contained in Rothschild's 18 July letter to Balfour referred to the principle "that Palestine should be reconstituted as the National Home of the Jewish people." In the final text, following Lord Milner's amendment, the word "reconstituted" was removed and the word "that" was replaced with "in". This text thereby avoided committing the entirety of Palestine as the National Home of the Jewish people, resulting in controversy in future years over the intended scope, especially from the Revisionist Zionist movement, which claimed the entirety of Mandatory Palestine and the Emirate of Transjordan as the Jewish homeland. This was clarified by the 1922 Churchill White Paper, which stated that "the terms of the declaration referred to do not contemplate that Palestine as a whole should be converted into a Jewish National Home, but that such a Home should be founded 'in Palestine'." The declaration did not include any geographical boundaries for Palestine. Following the end of the war, three documents – the declaration, the McMahon–Hussein correspondence and the Sykes–Picot Agreement – became the basis for the negotiations to set the boundaries of Palestine. Civil and religious rights of non-Jewish communities in Palestine. The declaration's first safeguard clause referred to protecting the civil and religious rights of non-Jews in Palestine.
The clause had been drafted together with the second safeguard by Leo Amery in consultation with Lord Milner, with the intention to "go a reasonable distance to meeting the objectors, both Jewish and pro-Arab, without impairing the substance of the proposed declaration". Arabs constituted around 90% of the population of Palestine, but – as stated by Ronald Storrs, Britain's Military Governor of Jerusalem between 1917 and 1920 – they were "not so much [named but] lumped together under the negative and humiliating definition of 'Non-Jewish Communities'". Additionally, there was no reference to protecting the political rights of this group, as there was regarding Jews in other countries. This lack of interest was frequently contrasted against the commitment to the Jewish community, with various terms being used over subsequent years to describe the two obligations as linked. A heated question was whether the status of both groups had "equal weight", which the British government and the Permanent Mandates Commission held to be the case in the 1930 Passfield White Paper. Balfour stated in February 1919 that Palestine was considered an exceptional case in which, referring to the local population, "we deliberately and rightly decline to accept the principle of self-determination," although he considered that the policy provided self-determination to Jews. Avi Shlaim considers this the declaration's "greatest contradiction". This principle of self-determination had been declared on numerous occasions subsequent to the declaration – President Wilson's January 1918 Fourteen Points, Sykes's Declaration to the Seven in June 1918, the November 1918 Anglo-French Declaration, and the June 1919 Covenant of the League of Nations that had established the mandate system. In an August 1919 memo, Balfour acknowledged the inconsistency among these statements, and further explained that the British had no intention of consulting the existing population of Palestine. The results of the ongoing American King–Crane Commission of Enquiry consultation of the local population – from which the British had withdrawn – were suppressed for three years until the report was leaked in 1922. Subsequent British governments have acknowledged this deficiency, in particular the 1939 committee led by the Lord Chancellor, Frederic Maugham, which concluded that the government had not been "free to dispose of Palestine without regard for the wishes and interests of the inhabitants of Palestine", and the April 2017 statement by British Foreign Office Minister of State Baroness Anelay that the government acknowledged that "the Declaration should have called for the protection of political rights of the non-Jewish communities in Palestine, particularly their right to self-determination." Rights and political status of Jews in other countries. The second safeguard clause was a commitment that nothing should be done which might prejudice the rights of the Jewish communities in other countries outside of Palestine. The original drafts of Rothschild, Balfour, and Milner did not include this safeguard, which was drafted together with the preceding safeguard in early October, in order to reflect opposition from influential members of the Anglo-Jewish community. Lord Rothschild took exception to the proviso on the basis that it presupposed the possibility of a danger to non-Zionists, which he denied.
The Conjoint Foreign Committee of the Board of Deputies of British Jews and the Anglo-Jewish Association had published a letter in "The Times" on 24 May 1917 entitled "Views of Anglo-Jewry", signed by the two organisations' presidents, David Lindo Alexander and Claude Montefiore, stating their view that: "the establishment of a Jewish nationality in Palestine, founded on this theory of homelessness, must have the effect throughout the world of stamping the Jews as strangers in their native lands, and of undermining their hard-won position as citizens and nationals of these lands." This was followed in late August by Edwin Montagu, an influential anti-Zionist Jew and Secretary of State for India, and the only Jewish member of the British Cabinet, who wrote in a Cabinet memorandum that "The policy of His Majesty's Government is anti-Semitic in result and will prove a rallying ground for anti-Semites in every country of the world." Reaction. The text of the declaration was published in the press one week after it was signed, on 9 November 1917. Other related events took place within a short timeframe, the two most relevant being the almost immediate British military capture of Palestine and the leaking of the previously secret Sykes–Picot Agreement. On the military side, both Gaza and Jaffa fell within several days, and Jerusalem was surrendered to the British on 9 December. The publication of the Sykes–Picot Agreement, following the Russian Revolution, in the Bolshevik "Izvestia" and "Pravda" on 23 November 1917 and in the British "Manchester Guardian" on 26 November 1917, represented a dramatic moment for the Allies' Eastern campaign: "the British were embarrassed, the Arabs dismayed and the Turks delighted." The Zionists had been aware of the outlines of the agreement since April and specifically the part relevant to Palestine, following a meeting between Weizmann and Cecil where Weizmann made very clear his objections to the proposed scheme. Zionist reaction. The declaration represented the first public support for Zionism by a major political power – its publication galvanized Zionism, which had finally obtained an official charter. In addition to its publication in major newspapers, leaflets were circulated throughout Jewish communities. These leaflets were airdropped over Jewish communities in Germany and Austria, as well as the Pale of Settlement, which had been given to the Central Powers following the Russian withdrawal. Weizmann had argued that the declaration would have three effects: it would swing Russia to maintain pressure on Germany's Eastern Front, since Jews had been prominent in the March Revolution of 1917; it would rally the large Jewish community in the United States to press for greater funding for the American war effort, underway since April of that year; and, lastly, that it would undermine German Jewish support for Kaiser Wilhelm II. The declaration spurred an unintended and extraordinary increase in the number of adherents of American Zionism; in 1914 the 200 American Zionist societies comprised a total of 7,500 members, which grew to 30,000 members in 600 societies in 1918 and 149,000 members in 1919. Whilst the British had considered that the declaration reflected a previously established dominance of the Zionist position in Jewish thought, it was the declaration itself that was subsequently responsible for Zionism's legitimacy and leadership.
Exactly one month after the declaration was issued, a large-scale celebration took place at the Royal Opera House – speeches were given by leading Zionists as well as members of the British administration including Sykes and Cecil. From 1918 until the Second World War, Jews in Mandatory Palestine celebrated Balfour Day as an annual national holiday on 2 November. The celebrations included ceremonies in schools and other public institutions and festive articles in the Hebrew press. In August 1919 Balfour approved Weizmann's request to name the first post-war settlement in Mandatory Palestine, "Balfouria", in his honour. It was intended to be a model settlement for future American Jewish activity in Palestine. Herbert Samuel, the Zionist MP whose 1915 memorandum had framed the start of discussions in the British Cabinet, was asked by Lloyd George on 24 April 1920 to act as the first civil governor of British Palestine, replacing the previous military administration that had ruled the area since the war. Shortly after beginning the role in July 1920, he was invited to read the "haftarah" from Isaiah 40 at the Hurva Synagogue in Jerusalem, which, according to his memoirs, led the congregation of older settlers to feel that the "fulfilment of ancient prophecy might at last be at hand". Opposition in Palestine. The local Christian and Muslim communities of Palestine, who constituted almost 90% of the population, strongly opposed the declaration. As described by the Palestinian-American philosopher Edward Said in 1979, it was perceived as being made: "(a) by a European power, (b) about a non-European territory, (c) in a flat disregard of both the presence and the wishes of the native majority resident in that territory, and (d) it took the form of a promise about this same territory to another foreign group." According to the 1919 King–Crane Commission, "No British officer, consulted by the Commissioners, believed that the Zionist programme could be carried out except by force of arms." A delegation of the Muslim-Christian Association, headed by Musa al-Husayni, expressed public disapproval on 3 November 1918, one day after the Zionist Commission parade marking the first anniversary of the Balfour Declaration. They handed a petition signed by more than 100 notables to Ronald Storrs, the British military governor. The group also protested the carrying of new "white and blue banners with two inverted triangles in the middle", drawing the attention of the British authorities to the serious consequences of any political implications in raising the banners. Later that month, on the first anniversary of the occupation of Jaffa by the British, the Muslim-Christian Association sent a lengthy memorandum and petition to the military governor protesting once more any formation of a Jewish state. The majority of Britain's military leaders considered Balfour's declaration either a mistake, or one that presented grave risks. Broader Arab response. In the broader Arab world, the declaration was seen as a betrayal of the British wartime understandings with the Arabs. The Sharif of Mecca and other Arab leaders considered the declaration a violation of a previous commitment made in the McMahon–Hussein correspondence in exchange for launching the Arab Revolt.
Following the publication of the declaration in an Egyptian newspaper, "Al Muqattam", the British dispatched Commander David George Hogarth to see Hussein in January 1918 bearing the message that the "political and economic freedom" of the Palestinian population was not in question. Hogarth reported that Hussein "would not accept an independent Jewish State in Palestine, nor was I instructed to warn him that such a state was contemplated by Great Britain". Hussein had also learned of the Sykes–Picot Agreement when it was leaked by the new Soviet government in December 1917, but was satisfied by two disingenuous messages from Sir Reginald Wingate, who had replaced McMahon as High Commissioner of Egypt, assuring him that the British commitments to the Arabs were still valid and that the Sykes–Picot Agreement was not a formal treaty. Continuing Arab disquiet over Allied intentions also led during 1918 to the British Declaration to the Seven and the Anglo-French Declaration, the latter promising "the complete and final liberation of the peoples who have for so long been oppressed by the Turks, and the setting up of national governments and administrations deriving their authority from the free exercise of the initiative and choice of the indigenous populations". In 1919, King Hussein refused to ratify the Treaty of Versailles. After February 1920, the British ceased paying his subsidy. In August 1920, five days after the signing of the Treaty of Sèvres, which formally recognized the Kingdom of Hejaz, Curzon asked Cairo to procure Hussein's signature to both treaties and agreed to make a payment of £30,000 conditional on signature. Hussein declined and in 1921, stated that he could not be expected to "affix his name to a document assigning Palestine to the Zionists and Syria to foreigners." Following the 1921 Cairo Conference, Lawrence was sent to try to obtain the King's signature to a treaty as well as to Versailles and Sèvres, a £60,000 annual subsidy being proposed; this attempt also failed. During 1923, the British made one further attempt to settle outstanding issues with Hussein, and once again the attempt foundered; Hussein continued to refuse to recognize the Balfour Declaration or any of the Mandates covering territory that he perceived as his domain. In March 1924, having briefly considered the possibility of removing the offending article from the treaty, the government suspended any further negotiations; within six months they withdrew their support in favour of their central Arabian ally Ibn Saud, who proceeded to conquer Hussein's kingdom. Allies and Associated Powers. The declaration was first endorsed by a foreign government on 27 December 1917, when Serbian Zionist leader and diplomat David Albala announced the support of Serbia's government in exile during a mission to the United States. The French and Italian governments offered their endorsements, on 14 February and 9 May 1918, respectively. At a private meeting in London on 1 December 1918, Lloyd George and French Prime Minister Georges Clemenceau agreed to certain modifications to the Sykes–Picot Agreement, including British control of Palestine. On 25 April 1920, the San Remo conference – an outgrowth of the Paris Peace Conference attended by the prime ministers of Britain, France and Italy, and the United States Ambassador to Italy – established the basic terms for three League of Nations mandates: a French mandate for Syria, and British mandates for Mesopotamia and Palestine.
With respect to Palestine, the resolution stated that the British were responsible for putting into effect the terms of the Balfour Declaration. The French and the Italians made clear their dislike of the "Zionist cast of the Palestinian mandate" and objected especially to language that did not safeguard the "political" rights of non-Jews, accepting Curzon's claim that "in the British language all ordinary rights were included in "civil rights"". At the request of France, it was agreed that an undertaking was to be inserted in the mandate's procès-verbal that this would not involve the surrender of the rights hitherto enjoyed by the non-Jewish communities in Palestine. The Italian endorsement of the Declaration had included the condition "... on the understanding that there is no prejudice against the legal and political status of the already existing religious communities ..." The boundaries of Palestine were left unspecified, to "be determined by the Principal Allied Powers." Three months later, in July 1920, the French defeat of Faisal's Arab Kingdom of Syria precipitated the British need to know "what is the 'Syria' for which the French received a mandate at San Remo?" and "does it include Transjordania?" – Britain subsequently decided to pursue a policy of associating Transjordan with the mandated area of Palestine without adding it to the area of the Jewish National Home. In 1922, the United States Congress officially endorsed America's support for the Balfour Declaration through the passage of the Lodge–Fish Resolution, notwithstanding opposition from the State Department. Professor Lawrence Davidson, of West Chester University, whose research focuses on American relations with the Middle East, argues that President Wilson and Congress ignored democratic values in favour of "biblical romanticism" when they endorsed the declaration. He points to an organized pro-Zionist lobby in the United States, which was active at a time when the country's small Arab American community had little political power. Central Powers. The publication of the Balfour Declaration was met with tactical responses from the Central Powers; however, the participation of the Ottoman Empire in the alliance meant that Germany was unable to effectively counter the British pronouncement. Two weeks following the declaration, Ottokar Czernin, the Austrian Foreign Minister, gave an interview to Arthur Hantke, President of the Zionist Federation of Germany, promising that his government would influence the Turks once the war was over. On 12 December, the Ottoman Grand Vizier, Talaat Pasha, gave an interview to the German newspaper "Vossische Zeitung" that was published on 31 December and subsequently released in the German-Jewish periodical "Jüdische Rundschau" on 4 January 1918, in which he referred to the declaration as "une blague" (a deception) and promised that under Ottoman rule "all justifiable wishes of the Jews in Palestine would be able to find their fulfilment" subject to the absorptive capacity of the country. This Turkish statement was endorsed by the German Foreign Office on 5 January 1918. On 8 January 1918, a German-Jewish Society, the Union of German Jewish Organizations for the Protection of the Rights of the Jews of the East, was formed to advocate for further progress for Jews in Palestine. Following the war, the Treaty of Sèvres was signed by the Ottoman Empire on 10 August 1920. The treaty dissolved the Ottoman Empire, requiring Turkey to renounce sovereignty over much of the Middle East.
Article 95 of the treaty incorporated the terms of the Balfour Declaration with respect to "the administration of Palestine, within such boundaries as may be determined by the Principal Allied Powers". Since incorporation of the declaration into the Treaty of Sèvres did not affect the legal status of either the declaration or the Mandate, there was also no effect when Sèvres was superseded by the Treaty of Lausanne, which did not include any reference to the declaration. In 1922, the German anti-Semitic theorist Alfred Rosenberg, in his primary contribution to Nazi theory on Zionism, "Der Staatsfeindliche Zionismus" ("Zionism, the Enemy of the State"), accused German Zionists of working for a German defeat and supporting Britain and the implementation of the Balfour Declaration, in a version of the stab-in-the-back myth. Adolf Hitler took a similar approach in some of his speeches from 1920 onwards. The Holy See. With the advent of the declaration and the British entry into Jerusalem on 9 December, the Vatican reversed its earlier sympathetic attitude to Zionism and adopted an oppositional stance that was to continue until the early 1990s. Evolution of British opinion. The British policy as stated in the declaration was to face numerous challenges to its implementation in the following years. The first of these was the indirect peace negotiations that took place between Britain and the Ottomans in December 1917 and January 1918 during a pause in the hostilities for the rainy season; although these peace talks were unsuccessful, archival records suggest that key members of the War Cabinet may have been willing to permit leaving Palestine under nominal Turkish sovereignty as part of an overall deal. In October 1919, almost a year after the end of the war, Lord Curzon succeeded Balfour as Foreign Secretary. Curzon had been a member of the 1917 Cabinet that had approved the declaration, and according to British historian Sir David Gilmour, Curzon had been "the only senior figure in the British government at the time who foresaw that its policy would lead to decades of Arab–Jewish hostility". He therefore determined to pursue a policy in line with its "narrower and more prudent rather than the wider interpretation". Following Bonar Law's appointment as Prime Minister in late 1922, Curzon wrote to Law that he regarded the declaration as "the worst" of Britain's Middle East commitments and "a striking contradiction of our publicly declared principles". In August 1920 the report of the Palin Commission, the first in a long line of British Commissions of Inquiry on the question of Palestine during the Mandate period, noted that "The Balfour Declaration ... is undoubtedly the starting point of the whole trouble". The conclusion of the report, which was not published, mentioned the Balfour Declaration three times, identifying it as among "the causes of the alienation and exasperation of the feelings of the population of Palestine". British public and government opinion became increasingly unfavourable to state support for Zionism; even Sykes had begun to change his views in late 1918. In February 1922 Churchill telegraphed Samuel, who had begun his role as High Commissioner for Palestine 18 months earlier, asking for cuts in expenditure. Following the issuance of the Churchill White Paper in June 1922, the House of Lords rejected a Palestine Mandate that incorporated the Balfour Declaration by 60 votes to 25, following a motion issued by Lord Islington.
The vote proved to be only symbolic as it was subsequently overruled by a vote in the House of Commons following a tactical pivot and variety of promises made by Churchill. In February 1923, following the change in government, the Colonial Secretary, Victor Cavendish, Duke of Devonshire, laid the foundation for a secret review of Palestine policy in a lengthy memorandum for the Cabinet. His covering note asked for a statement of policy to be made as soon as possible and proposed that the cabinet focus on three questions: (1) whether or not pledges to the Arabs conflict with the Balfour declaration; (2) if not, whether the new government should continue the policy set down by the old government in the 1922 White Paper; and (3) if not, what alternative policy should be adopted. Stanley Baldwin, who replaced Bonar Law as Prime Minister, set up a cabinet sub-committee in June 1923 to re-examine the Palestine policy. The Cabinet approved the report of this committee on 31 July 1923. Describing it as "nothing short of remarkable", Quigley noted that the government was admitting to itself that its support for Zionism had been prompted by considerations having nothing to do with the merits of Zionism or its consequences for Palestine. As Huneidi noted, "wise or unwise, it is well nigh impossible for any government to extricate itself without a substantial sacrifice of consistency and self-respect, if not honour." The wording of the declaration was thus incorporated into the British Mandate for Palestine, a legal instrument that created Mandatory Palestine with an explicit purpose of putting the declaration into effect, and was finally formalized in September 1923. Unlike the declaration itself, the Mandate was legally binding on the British government. In June 1924, Britain submitted its report to the Permanent Mandates Commission for the period July 1920 to the end of 1923; it contained nothing of the candour reflected in the internal documents, and the documents relating to the 1923 reappraisal stayed secret until the early 1970s. Historiography and motivations. Lloyd George and Balfour remained in government until the collapse of the coalition in October 1922. Under the new Conservative government, attempts were made to identify the background to and motivations for the declaration. A private Cabinet memorandum was produced in January 1923, providing a summary of the then-known Foreign Office and War Cabinet records leading up to the declaration. An accompanying Foreign Office note asserted that the primary authors of the declaration were Balfour, Sykes, Weizmann, and Sokolow, with "perhaps Lord Rothschild as a figure in the background", and that "negotiations seem to have been mainly oral and by means of private notes and memoranda of which only the scantiest records seem to be available." Following the 1936 general strike that was to degenerate into the 1936–1939 Arab revolt in Palestine, the most significant outbreak of violence since the Mandate began, a British Royal Commission – a high-profile public inquiry – was appointed to investigate the causes of the unrest. The Palestine Royal Commission, appointed with significantly broader terms of reference than the previous British inquiries into Palestine, completed its 404-page report after six months of work in June 1937, publishing it a month later. The report began by describing the history of the problem, including a detailed summary of the origins of the Balfour Declaration. Much of this summary relied on Lloyd George's personal testimony; Balfour had died in 1930 and Sykes in 1919.
He told the commission that the declaration was made "due to propagandist reasons ... In particular Jewish sympathy would confirm the support of American Jewry, and would make it more difficult for Germany to reduce her military commitments and improve her economic position on the eastern front". Two years later, in his "Memoirs of the Peace Conference", Lloyd George described a total of nine factors motivating his decision as Prime Minister to release the declaration, including the additional reasons that a Jewish presence in Palestine would strengthen Britain's position on the Suez Canal and reinforce the route to their imperial dominion in India. These geopolitical calculations were debated and discussed in the following years. Historians agree that the British believed that expressing support would appeal to Jews in Germany and the United States, given that two of Woodrow Wilson's closest advisors were known to be avid Zionists; they also hoped to encourage support from the large Jewish population in Russia. In addition, the British intended to pre-empt the expected French pressure for an international administration in Palestine. Some historians argue that the British government's decision reflected what James Gelvin, Professor of Middle Eastern History at UCLA, calls "patrician anti-Semitism" in the overestimation of Jewish power in both the United States and Russia. American Zionism was still in its infancy; in 1914 the Zionist Federation had a small budget of about $5,000 and only 12,000 members, despite an American Jewish population of three million. The Zionist organizations had, however, recently succeeded, following a show of force within the American Jewish community, in arranging a Jewish congress to debate the Jewish problem as a whole. This impacted British and French government estimates of the balance of power within the American Jewish public. Avi Shlaim, emeritus Professor of International Relations at the University of Oxford, asserts that two main schools of thought have been developed on the question of the primary driving force behind the declaration, one presented in 1961 by Leonard Stein, a lawyer and former political secretary to the World Zionist Organization, and the other in 1970 by Mayir Vereté, then Professor of Israeli History at the Hebrew University of Jerusalem. Shlaim states that Stein does not reach any clear-cut conclusions, but that implicit in his narrative is that the declaration resulted primarily from the activity and skill of the Zionists, whereas according to Vereté, it was the work of hard-headed pragmatists motivated by British imperial interests in the Middle East. Much of modern scholarship on the decision to issue the declaration focuses on the Zionist movement and rivalries within it, with a key debate being whether the role of Weizmann was decisive or whether the British were likely to have issued a similar declaration in any event. Danny Gutwein, Professor of Jewish History at the University of Haifa, proposes a twist on an old idea, asserting that Sykes's February 1917 approach to the Zionists was the defining moment, and that it was consistent with the pursuit of the government's wider agenda to partition the Ottoman Empire. Historian J. C. Hurewitz has written that British support for a Jewish homeland in Palestine was part of an effort to secure a land bridge between Egypt and the Persian Gulf by annexing territory from the Ottoman Empire. Long-term impact.
The declaration had two indirect consequences: the emergence of Israel and a chronic state of conflict between Arabs and Jews throughout the Middle East. It has been described as the "original sin" with respect to both Britain's failure in Palestine and wider events in Palestine. The statement also had a significant impact on the traditional anti-Zionism of religious Jews, some of whom saw it as divine providence; this contributed to the growth of religious Zionism amid the larger Zionist movement. Starting in 1920, intercommunal conflict in Mandatory Palestine broke out, which widened into the regional Arab–Israeli conflict, often referred to as the world's "most intractable conflict". The "dual obligation" to the two communities quickly proved to be untenable; the British subsequently concluded that it was impossible for them to pacify the two communities in Palestine by using different messages for different audiences. The Palestine Royal Commission – in making the first official proposal for partition of the region – referred to the requirements as "contradictory obligations", and that the "disease is so deep-rooted that, in our firm conviction, the only hope of a cure lies in a surgical operation". Following the 1936–1939 Arab revolt in Palestine, and as worldwide tensions rose in the buildup to the Second World War, the British Parliament approved the White Paper of 1939 – their last formal statement of governing policy in Mandatory Palestine – declaring that Palestine should not become a Jewish State and placing restrictions on Jewish immigration. Whilst the British considered this consistent with the Balfour Declaration's commitment to protect the rights of non-Jews, many Zionists saw it as a repudiation of the declaration. Although this policy lasted until the British surrendered the Mandate in 1948, it served only to highlight the fundamental difficulty for Britain in carrying out the Mandate obligations. Britain's involvement in this became one of the most controversial parts of its Empire's history and damaged its reputation in the Middle East for generations. According to historian Elizabeth Monroe: "measured by British interests alone, [the declaration was] one of the greatest mistakes in [its] imperial history." The 2010 study by Jonathan Schneer, specialist in modern British history at Georgia Tech, concluded that because the build-up to the declaration was characterized by "contradictions, deceptions, misinterpretations, and wishful thinking", the declaration sowed dragon's teeth and "produced a murderous harvest, and we go on harvesting even today". The foundation stone for modern Israel had been laid, but the prediction that this would lay the groundwork for harmonious Arab-Jewish cooperation proved to be wishful thinking. On the bicentenary of its foundation, the British newspaper "The Guardian", reflecting on its major errors of judgment, included among them the support the paper's editor, C. P. Scott, had given to Balfour's declaration. Israel had not become, it said, "the country the Guardian foresaw or would have wanted". The Board of Deputies of British Jews, through its president Marie van der Zyl, denounced the column as "breathtakingly ill-considered", declaring that the Guardian appeared "to do everything it can to undermine the legitimacy of the world's only Jewish state". The document.
The document was presented to the British Museum in 1924 by Walter Rothschild; today it is held in the British Library (which separated from the British Museum in 1973) as Additional Manuscripts number 41178. From October 1987 to May 1988 it was lent outside the UK for display in Israel's Knesset.
Black Hand (Serbia)
Unification or Death, popularly known as the Black Hand, was a secret military society formed in May 1911 by officers in the Army of the Kingdom of Serbia. It gained a reputation for its alleged involvement in the assassination of Archduke Franz Ferdinand in Sarajevo in 1914 and for the earlier assassination of the Serbian royal couple in 1903, under the aegis of Captain Dragutin Dimitrijević ("Apis"). The society formed to unite all of the territories with a South Slavic majority that were not then ruled by either Serbia or Montenegro. It took inspiration primarily from the unification of Italy in 1859–1870, but also from the unification of Germany in 1871. Through its connections to the June 1914 assassination of Archduke Franz Ferdinand in Sarajevo, carried out by members of the youth movement Young Bosnia, the Black Hand was instrumental in starting World War I (1914–1918) by precipitating the July Crisis of 1914, which eventually led to Austria-Hungary's invasion of the Kingdom of Serbia in August 1914. Background. Apis' conspiracy group and the May Coup. In August 1901, a group of junior officers headed by Captain Dragutin Dimitrijević "Apis" established a conspiracy group (referred to in the literature as the Black Hand) against the Obrenović dynasty. The first meeting was held on 6 September 1901. In attendance were captains Radomir Aranđelović, Milan F. Petrović, and Dragutin Dimitrijević, as well as lieutenants Antonije Antić, Dragutin Dulić, Milan Marinković, and Nikodije Popović. They made a plan to kill the royal couple – King Alexander I Obrenović and Queen Draga. On the night of 28/29 May 1903 (Old Style), Captain Apis personally led a group of Army officers who murdered the royal couple at the Old Palace in Belgrade. Along with the royal couple, the conspirators killed Prime Minister Dimitrije Cincar-Marković, Minister of the Army Milovan Pavlović, and General-Adjutant Lazar Petrović. This became known as the May Coup. National Defense. On 8 October 1908, just two days after Austria annexed Bosnia and Herzegovina, Serbian ministers, officials, and generals held a meeting at the City Hall in Belgrade. They founded a semi-secret society, the "Narodna Odbrana" ("National Defense"), which gave Pan-Serbism a focus and an organization. The purpose of the group was to liberate Serbs under Austro-Hungarian occupation. It also spread anti-Austrian propaganda and organized spies and saboteurs to operate within the occupied provinces. Satellite groups were formed in Slovenia, Bosnia, Herzegovina, and Istria. The Bosnian group became deeply associated with local groups of pan-Serb activists such as "Mlada Bosna" ("Young Bosnia"). Establishment. Unification or Death was established at the beginning of May 1911, and the original constitution of the organization was signed on 9 May. Ljuba Čupa, Bogdan Radenković, and Vojislav Tankosić wrote the constitution of the organization, modeled after similar German secret nationalist associations and the Italian Carbonari. The organization was mentioned in the Serbian parliament as the "Black Hand" in late 1911. By 1911–12, Narodna Odbrana had established ties with the Black Hand, and the two became "parallel in action and overlapping in membership". 1911–13. The organization used the magazine "Pijemont" (the Serbian name for Piedmont, the kingdom that led the unification of Italy under the House of Savoy) for the dissemination of its ideas. The magazine was founded by Ljuba Čupa in August 1911. 1914.
By 1914, the group had hundreds of members, many of them Serbian Army officers. It pursued its goal of uniting Serb-inhabited territories by training guerrilla fighters and saboteurs. The Black Hand was organized at the grassroots level in cells of three to five members, supervised by district committees and by a Central Committee in Belgrade, whose ten-member executive committee was led primarily by Colonel Dragutin Dimitrijević "Apis". To ensure secrecy, members rarely knew more than the other members of their own cell and a single superior above them. New members swore an oath upon joining. The Black Hand took over the terrorist actions of "Narodna Odbrana" and deliberately worked to obscure any distinctions between the two groups, trading on the prestige and network of the older organization. Black Hand members held important army and government positions. Crown Prince Alexander was an enthusiastic financial supporter. The group held influence over government appointments and policies. The Serbian government was fairly well informed of Black Hand activities, but relations between the two had cooled considerably by 1914. The Black Hand was displeased with Prime Minister Nikola Pašić and thought that he did not act aggressively enough for the Pan-Serb cause. It engaged in a bitter power struggle with the government over several issues, such as who would control territories that Serbia had annexed during the Balkan Wars. By then, disagreeing with the Black Hand was dangerous, as political murder was one of its tools. In 1914, Apis allegedly decided that Archduke Franz Ferdinand, the heir apparent of Austria, should be assassinated, since the Archduke was seeking to pacify the empire's Serb population, which, if successful, would have forestalled a revolution. Towards that end, three young Bosnian Serbs were allegedly recruited to kill the Archduke. They were certainly trained in bomb throwing and marksmanship by current and former members of the Serbian military. Gavrilo Princip, Nedeljko Čabrinović, and Trifko Grabež were smuggled across the border back into Bosnia by a chain of contacts similar to the Underground Railroad. The decision to kill the Archduke is said to have been initiated by Apis without the sanction of the full Executive Committee, though whether Apis was involved at all remains in dispute. Legacy. The Black Hand was outlawed in Serbia in 1917, although its ideas continued to be influential after both World Wars. It inspired the creation of an organization known as the White Hand. In 1938, Konspiracija, a conspiracy group aiming to overthrow the Yugoslav regency, was founded by, among others, members of the Serbian Cultural Club (SKK). The organization was modeled after the Black Hand, including in its recruitment process. Two members of the Black Hand, Antonije Antić and Velimir Vemić, were the organization's military advisors.
4822
7903804
https://en.wikipedia.org/wiki?curid=4822
Board of directors
A board of directors is a governing body that supervises the activities of a business, a nonprofit organization, or a government agency. The powers, duties, and responsibilities of a board of directors are determined by government regulations (including the jurisdiction's corporate law) and the organization's own constitution and by-laws. These authorities may specify the number of members of the board, how they are to be chosen, and how often they are to meet. In an organization with voting members, the board is accountable to, and may be subordinate to, the organization's full membership, which usually elects the members of the board. In a stock corporation, non-executive directors are elected by the shareholders, and the board has ultimate responsibility for the management of the corporation. In nations with codetermination (such as Germany and Sweden), the workers of a corporation elect a set fraction of the board's members. The board of directors appoints the chief executive officer of the corporation and sets out the overall strategic direction. In corporations with dispersed ownership, the identification and nomination of directors (that shareholders vote for or against) are often done by the board itself, leading to a high degree of self-perpetuation. In a non-stock corporation with no general voting membership, the board is the supreme governing body of the institution, and its members are sometimes chosen by the board itself. Terminology. Other names include "board of directors and advisors", "board of governors", "board of managers", "board of regents", "board of trustees", and "board of visitors". It may also be called the "executive board". Roles. Typical duties of boards of directors include setting broad organizational policy, selecting and reviewing the performance of the chief executive, approving annual budgets, and accounting to the members or shareholders for the organization's performance. The legal responsibilities of boards and board members vary with the nature of the organization, and between jurisdictions. For companies whose shares are publicly listed for trading, these responsibilities are typically much more rigorous and complex than those of other types. Typically, the board chooses one of its members to be the "chairman" (often now called the "chair" or "chairperson"), who holds whatever title is specified in the by-laws or articles of association. However, in membership organizations, the members elect the president of the organization, and the president becomes the board chair unless the by-laws say otherwise. Directors. The directors of an organization are the persons who are members of its board. Several specific terms categorize directors by the presence or absence of their other relationships to the organization. Honorary members. Corporations may designate a former senior executive and ex-board member as an honorary board member, a position that does not carry any executive authority but represents recognition of the person's corporate governorship and performance. Inside director. An inside director is a director who is also an employee, officer, chief executive, major shareholder, or someone similarly connected to the organization. Inside directors represent the interests of the entity's stakeholders and often have special knowledge of its inner workings, its financial or market position, and so on. Typical inside directors include the organization's chief executive and other senior executives, as well as major shareholders. An inside director who is employed as a manager or executive of the organization is sometimes referred to as an executive director (not to be confused with the title "executive director" sometimes used for the CEO position in some organizations). 
Executive directors often have a specified area of responsibility in the organization, such as finance, marketing, human resources, or production. Outside director. An outside director is a member of the board who is not otherwise employed by or engaged with the organization, and does not represent any of its stakeholders. A typical example is a director who is president of a firm in a different industry. Outside directors are not employees of the company or affiliated with it in any other way. Outside directors bring outside experience and perspectives to the board. For example, for a company that serves a domestic market only, the presence of CEOs from global multinational corporations as outside directors can help to provide insights on export and import opportunities and international trade options. One of the arguments for having outside directors is that they can keep a watchful eye on the inside directors and on the way the organization is run. Outside directors are unlikely to tolerate "insider dealing" between inside directors, as outside directors do not benefit from the company or organization. Outside directors are often useful in handling disputes between inside directors, or between shareholders and the board. They are thought to be advantageous because they can be objective and present little risk of conflict of interest. On the other hand, they might lack familiarity with the specific issues connected to the organization's governance, and they might not know about the industry or sector in which the organization is operating. Interlocking directorates. Individual directors often serve on more than one board. This practice results in an interlocking directorate, where a relatively small number of individuals have significant influence over many important entities. This situation can have important corporate, social, economic, and legal consequences, and has been the subject of significant research. Process and structure. The process for running a board, sometimes called the board process, includes the selection of board members, the setting of clear board objectives, the dissemination of documents or a board package to the board members, the collaborative creation of an agenda for the meeting, the creation and follow-up of assigned action items, and the assessment of the board process through standardized assessments of board members, owners, and CEOs. The science of this process has been slow to develop due to the secretive nature of the way most companies run their boards; however, some standardization is beginning to emerge. Among those pushing for this standardization in the US are the National Association of Corporate Directors, McKinsey, and The Board Group. Board meetings. A board of directors conducts its meetings according to the rules and procedures contained in its governing documents. These procedures may allow the board to conduct its business by conference call or other electronic means. They may also specify how a quorum is to be determined. Non-corporate boards. The responsibilities of a board of directors vary depending on the nature and type of business entity and the laws applying to the entity (see types of business entity). For example, the nature of the business entity may be one that is traded on a public market (public company), not traded on a public market (a private, limited or closely held company), owned by family members (a family business), or exempt from income taxes (a non-profit, not for profit, or tax-exempt entity). 
There are numerous types of business entities available throughout the world, such as a corporation, limited liability company, cooperative, business trust, partnership, private limited company, and public limited company. Much of what has been written about boards of directors relates to boards of directors of business entities actively traded on public markets. More recently, however, material is becoming available for boards of private and closely held businesses, including family businesses. A board-only organization is one whose board is self-appointed, rather than being accountable to a base of members through elections, or in which the powers of the membership are extremely limited. Membership organizations. In membership organizations, such as a society made up of members of a certain profession or one advocating a certain cause, a board of directors may have the responsibility of running the organization in between meetings of the membership, especially if the membership meets infrequently, such as only at an annual general meeting. The amount of power and authority delegated to the board depends on the bylaws and rules of the particular organization. Some organizations place matters exclusively in the board's control, while in others the general membership retains full power and the board can only make recommendations. The setup of a board of directors varies widely across organizations and may include provisions that are applicable to corporations, in which the "shareholders" are the members of the organization. One difference is that in some organizations the membership elects the officers of the organization, such as the president and the secretary, and these officers become members of the board in addition to the directors while retaining their officer duties on the board; such directors may also be classified as officers. There may also be ex-officio members of the board, or persons who are members due to another position that they hold. These ex-officio members have all the same rights as the other board members. Members of the board may be removed before their term is complete. Details on how they can be removed are usually provided in the bylaws. If the bylaws do not contain such details, the section on disciplinary procedures in "Robert's Rules of Order" may be used. Corporations. In a publicly held company, directors are elected to represent the owners of the company (the shareholders/stockholders) and are legally obligated as fiduciaries to act in their interest. In this capacity they establish policies and make decisions on issues such as whether a dividend is paid and how large it is, what stock options are distributed to employees, and the hiring/firing and compensation of upper management. Governance. Theoretically, the control of a company is divided between two bodies: the board of directors, and the shareholders in general meeting. In practice, the amount of power exercised by the board varies with the type of company. In small private companies, the directors and the shareholders are normally the same people, and thus there is no real division of power. In large public companies, the board tends to exercise more of a supervisory role, and individual responsibility and management tend to be delegated downward to individual professional executives (such as a finance director or a marketing director) who deal with particular areas of the company's affairs. Another feature of boards of directors in large public companies is that the board tends to have more "de facto" power. 
Most shareholders do not attend shareholder meetings, but rather cast proxy votes via mail, phone, or internet, thus allowing the board to vote on their behalf. However, proxy votes are not a total delegation of voting power, as the board must vote the proxy shares as directed by their owner even when it contradicts the board's views. In addition, many shareholders vote to accept all recommendations of the board rather than try to get involved in management, since each shareholder's power, as well as their interest and information, is so small. Larger institutional investors also grant the board proxies. The large number of shareholders also makes it hard for them to organize. However, there have been recent moves to try to increase shareholder activism among both institutional investors and individuals with small shareholdings. A contrasting view is that in large public companies it is upper management and not boards that wield practical power, because boards delegate nearly all of their power to the top executive employees, adopting their recommendations almost without fail. As a practical matter, executives even choose the directors, with shareholders normally following management recommendations and voting for them. In most cases, serving on a board is not a career unto itself. For major corporations, the board members are usually professionals or leaders in their field. In the case of outside directors, they are often senior leaders of other organizations. Nevertheless, board members often receive remuneration amounting to hundreds of thousands of dollars per year, since they often sit on the boards of several companies. Inside directors are usually not paid for sitting on a board; the duty is instead considered part of their larger job description. Outside directors are usually paid for their services. This remuneration varies between corporations, but usually consists of a yearly or monthly salary, additional compensation for each meeting attended, stock options, and various other benefits, such as travel, hotel, and meal expenses for the board meetings. Tiffany & Co., for example, pays directors an annual retainer of $46,500, an additional annual retainer of $2,500 if the director is also a chairperson of a committee, a per-meeting-attended fee of $2,000 for meetings attended in person, and a $500 fee for each meeting attended via telephone, in addition to stock options and retirement benefits. Academic research has identified different types of board directors, whose characteristics and experiences shape their role and performance. For instance, directors with multiple mandates are often referred to as busy directors; their monitoring performance is considered comparatively weak due to the limited time they can dedicate to the task. Overconfident directors are found to pay higher premiums in corporate acquisitions and make worse takeover choices. Locally rooted directors tend to be overrepresented and to lack international experience, which can lead to lower valuations, especially in internationally oriented firms. Directors' military experience is associated with rigorous monitoring and improved corporate governance. Two-tier system. In some European and Asian countries, there are two separate boards: an executive board (or management board) for day-to-day business and a supervisory board (elected by the shareholders and employees) for supervising the executive board. 
In these countries, the chairman of the supervisory board is equivalent to the chairman of a single-tier board, while the chairman of the management board is reckoned as the company's CEO or managing director. These two roles are always held by different people. This ensures a distinction between management by the executive board and governance by the supervisory board, and allows for clear lines of authority. The aim is to prevent a conflict of interest and too much power being concentrated in the hands of one person. There is a strong parallel here with the structure of government, which tends to separate the political cabinet from the administrative civil service. In the United States, the board of directors (elected by the shareholders) is often equivalent to the supervisory board, while the executive board may often be known as the executive committee (operating committee or executive council), composed of the CEO and their direct reports (other C-level officers, division/subsidiary heads). Board structures and procedures vary both within and among OECD countries. Some countries have two-tier boards that separate the supervisory function and the management function into different bodies. Such systems typically have a "supervisory board" composed of non-executive board members and a "management board" composed entirely of executives. Other countries have "unitary" boards, which bring together executive and non-executive board members. In some countries there is also an additional statutory body for audit purposes. The OECD Principles are intended to be sufficiently general to apply to whatever board structure is charged with the functions of governing the enterprise and monitoring management. History. The development of a separate board of directors to manage, govern, and oversee a company has occurred incrementally, and without clear definition, over legal history. Until the end of the 19th century, it seems to have been generally assumed that the general meeting (of all shareholders) was the supreme organ of a company, and that the board of directors merely acted as an agent of the company subject to the control of the shareholders in general meeting. However, by 1906, the English Court of Appeal had made it clear in the decision of "Automatic Self-Cleansing Filter Syndicate Co Ltd v Cuninghame" [1906] 2 Ch 34 that the division of powers between the board and the shareholders in general meeting depended on the construction of the articles of association and that, where the powers of management were vested in the board, the general meeting could not interfere with their lawful exercise. The articles were held to constitute a contract by which the members had agreed that "the directors and the directors alone shall manage." The new approach did not secure immediate approval, but it was endorsed by the House of Lords in "Quin & Axtens v Salmon" [1909] AC 442 and has since received general acceptance. Under English law, successive versions of Table A have reinforced the norm that, unless the directors are acting contrary to the law or the provisions of the Articles, the powers of conducting the management and affairs of the company are vested in them. The modern doctrine was expressed in "John Shaw & Sons (Salford) Ltd v Shaw" [1935] 2 KB 113 by Greer LJ as follows: A company is an entity distinct alike from its shareholders and its directors. Some of its powers may, according to its articles, be exercised by directors, certain other powers may be reserved for the shareholders in general meeting. 
If powers of management are vested in the directors, they and they alone can exercise these powers. The only way in which the general body of shareholders can control the exercise of the powers vested by the articles in the directors is by altering the articles, or, if opportunity arises under the articles, by refusing to re-elect the directors of whose actions they disapprove. They cannot themselves usurp the powers which by the articles are vested in the directors any more than the directors can usurp the powers vested by the articles in the general body of shareholders. It has been remarked that this development in the law was somewhat surprising at the time, as the relevant provisions in Table A (as it was then) seemed to contradict this approach rather than to endorse it. Demographics. The rising age of board members has become a notable trend across global stock exchanges, reflecting an increasingly older demographic in leadership roles. As of 2025, the average age of board members of companies listed on the Tokyo Stock Exchange stands at 62.2 years, while the United Kingdom reports an average of 60.1 years, Germany 59.3 years, South Korea 59 years, and Hong Kong 54.9 years. New York Stock Exchange companies exhibit the highest average age, at 62.9 years. In response to this aging trend, some firms, particularly those in entertainment and trend-driven sectors, have actively sought to appoint younger board members. An example is Sanrio, the Japanese company behind the Hello Kitty franchise, which reduced the average age of its board from 68 years in 2020 to 51 years by 2024, signaling a deliberate intergenerational shift in governance. Election and removal. In most legal systems, the appointment and removal of directors is voted upon by the shareholders in general meeting or through a proxy statement. For publicly traded companies in the U.S., the directors who stand for election are largely selected by either the board as a whole or a nominating committee. Although in 2002 the New York Stock Exchange and the NASDAQ required that nominating committees consist of independent directors as a condition of listing, nominating committees have historically received input from management in their selections, even when the CEO does not have a position on the board. Shareholder nominations can only occur at the general meeting itself or through the prohibitively expensive process of mailing out ballots separately; in May 2009 the SEC proposed a new rule allowing shareholders meeting certain criteria to add nominees to the proxy statement. In practice, for publicly traded companies, the managers (inside directors) who are purportedly accountable to the board of directors have historically played a major role in selecting and nominating the directors who are voted on by the shareholders; in such cases more "gray outsider directors" (independent directors with conflicts of interest) are nominated and elected. In countries with co-determination, a fixed fraction of the board is elected by the corporation's workers. Directors may also leave office by resignation or death. In some legal systems, directors may also be removed by a resolution of the remaining directors (in some countries they may only do so "with cause"; in others the power is unrestricted). Some jurisdictions also permit the board of directors to appoint directors, either to fill a vacancy which arises on resignation or death, or as an addition to the existing directors. 
In practice, it can be quite difficult to remove a director by a resolution in general meeting. In many legal systems, the director has a right to receive special notice of any resolution to remove them; the company must often supply a copy of the proposal to the director, who is usually entitled to be heard by the meeting. The director may require the company to circulate any representations that they wish to make. Furthermore, the director's contract of service will usually entitle them to compensation if they are removed, and may often include a generous "golden parachute", which also acts as a deterrent to removal. A 2010 study examined how corporate shareholders voted in director elections in the United States. It found that directors received fewer votes from shareholders when their companies performed poorly, had excess CEO compensation, or had poor shareholder protection. Directors also received fewer votes when they did not regularly attend board meetings or received negative recommendations from a proxy advisory firm. The study also showed that companies often improve their corporate governance by removing poison pills or classified boards, and by reducing excessive CEO pay, after their directors receive low shareholder support. Board accountability to shareholders is a recurring issue. In September 2010, "The New York Times" noted that several directors who had overseen companies which had failed in the 2008 financial crisis had found new positions as directors. The SEC sometimes imposes a ban (a "D&O bar") on serving on a board as part of its fraud cases, and one of these was upheld in 2013. Exercise of powers. The exercise by the board of directors of its powers usually occurs in board meetings. Most legal systems require sufficient notice to be given to all directors of these meetings, and that a quorum must be present before any business may be conducted. Usually, a meeting which is held without notice having been given is still valid if all of the directors attend, but it has been held that a failure to give notice may negate resolutions passed at a meeting, because the oratory of a minority of directors might have persuaded the majority to change their minds and vote otherwise. In most common law countries, the powers of the board are vested in the board as a whole, and not in the individual directors. However, in some instances an individual director may still bind the company by their acts by virtue of their ostensible authority (see also: the rule in "Turquand's Case"). Duties. Because directors exercise control and management over the organization, but organizations are (in theory) run for the benefit of the shareholders, the law imposes strict duties on directors in relation to the exercise of their functions. The duties imposed on directors are fiduciary duties, similar to those that the law imposes on those in similar positions of trust: agents and trustees. The duties apply to each director separately, while the powers apply to the board jointly. Also, the duties are owed to the company itself, and not to any other entity. This does not mean that directors can never stand in a fiduciary relationship to the individual shareholders; they may well have such a duty in certain circumstances. "Proper purpose". Directors must exercise their powers for a proper purpose. 
While in many instances an improper purpose is readily evident, such as a director looking to enrich themselves or to divert an investment opportunity to a relative, such breaches usually also involve a breach of the director's duty to act in good faith. Greater difficulties arise where the director, while acting in good faith, is serving a purpose that is not regarded by the law as proper. The seminal authority in the United Kingdom on what amounts to a proper purpose is the Supreme Court decision in "Eclairs Group Ltd v JKX Oil & Gas plc" (2015). The case concerned the powers of directors under the articles of association of the company to disenfranchise voting rights attached to shares for failure to properly comply with a notice served on the shareholders. Prior to that case the leading authority was "Howard Smith Ltd v Ampol Ltd" [1974] AC 821. The case concerned the power of the directors to issue new shares. It was alleged that the directors had issued many new shares purely to deprive a particular shareholder of his voting majority. An argument that the power to issue shares could only be properly exercised to raise new capital was rejected as too narrow, and it was held that it would be a proper exercise of the directors' powers to issue shares to a larger company to ensure the financial stability of the company, or as part of an agreement to exploit mineral rights owned by the company. If so, the mere fact that an incidental result (even a desired one) was that a shareholder lost their majority, or that a takeover bid was defeated, would not in itself make the share issue improper. But if the sole purpose was to destroy a voting majority, or to block a takeover bid, that would be an improper purpose. However, not all jurisdictions recognise the "proper purpose" duty as separate from the "good faith" duty. "Unfettered discretion". Directors cannot, without the consent of the company, fetter their discretion in relation to the exercise of their powers, and cannot bind themselves to vote in a particular way at future board meetings. This is so even if there is no improper motive or purpose, and no personal advantage to the director. This does not mean, however, that the board cannot agree to the company entering into a contract which binds the company to a certain course, even if certain actions in that course will require further board approval. The company remains bound, but the directors retain the discretion to vote against taking the future actions (although that may involve a breach by the company of the contract that the board previously approved). "Conflict of duty and interest". As fiduciaries, the directors may not put themselves in a position where their interests and duties conflict with the duties that they owe to the company. The law takes the view that good faith must not only be done, but must be manifestly seen to be done, and zealously patrols the conduct of directors in this regard; it will not allow directors to escape liability by asserting that their decisions were in fact well founded. Traditionally, the law has divided conflicts of duty and interest into three sub-categories: 1. Transactions with the company. By definition, where a director enters into a transaction with a company, there is a conflict between the director's interest (to enrich themselves through the transaction) and their duty to the company (to ensure that the company gets as much as it can out of the transaction). 
In some places, this rule is so strictly enforced that, even where the conflict of interest or conflict of duty is purely hypothetical, the directors can be forced to disgorge all personal gains arising from it. In "Aberdeen Ry v Blaikie" (1854) 1 Macq HL 461, Lord Cranworth stated in his judgment that: "A corporate body can only act by agents, and it is, of course, the duty of those agents so to act as best to promote the interests of the corporation whose affairs they are conducting. Such agents have duties to discharge of a fiduciary nature towards their principal. And it is a rule of universal application that no one, having such duties to discharge, shall be allowed to enter into engagements in which he has, "or can have", a personal interest conflicting "or which possibly may conflict", with the interests of those whom he is bound to protect... So strictly is this principle adhered to that no question is allowed to be raised as to the fairness or unfairness of the contract entered into..." (emphasis added) However, in many jurisdictions the members of the company are permitted to ratify transactions which would otherwise fall foul of this principle. It is also largely accepted in most jurisdictions that this principle can be overridden in the company's constitution. In many countries, there is also a statutory duty to declare interests in relation to any transactions, and the director can be fined for failing to make disclosure. 2. Use of corporate property, opportunity, or information. Directors must not, without the informed consent of the company, use for their own profit the company's assets, opportunities, or information. This prohibition is much less flexible than the prohibition against transactions with the company, and attempts to circumvent it using provisions in the articles have met with limited success. In "Regal (Hastings) Ltd v Gulliver" [1942] All ER 378, the House of Lords, in upholding what was regarded as a wholly unmeritorious claim by the shareholders, held that: "(i) that what the directors did was so related to the affairs of the company that it can properly be said to have been done in the course of their management and in the utilisation of their opportunities and special knowledge as directors; and (ii) that what they did resulted in profit to themselves." And accordingly, the directors were required to disgorge the profits that they made, and the shareholders received their windfall. The decision has been followed in several subsequent cases, and is now regarded as settled law. 3. Competing with the company. Directors cannot compete directly with the company without a conflict of interest arising. Similarly, they should not act as directors of competing companies, as their duties to each company would then conflict with each other. Common law duties of care and skill. Traditionally, the level of care and skill which has to be demonstrated by a director has been framed largely with reference to the non-executive director. In "Re City Equitable Fire Insurance Co" [1925] Ch 407, it was expressed in purely subjective terms, where the court held that: "a director need not exhibit in the performance of his duties a greater degree of skill than may reasonably be expected from a person of "his" knowledge and experience." 
("emphasis" added) However, this decision was based firmly in the older notions (see above) that prevailed at the time as to the mode of corporate decision making, and effective control residing in the shareholders; if they elected and put up with an incompetent decision maker, they should not have recourse to complain. However, a more modern approach has since developed, and in "Dorchester Finance Co Ltd v Stebbing" [1989] BCLC 498 the court held that the rule in "Equitable Fire" related only to skill, and not to diligence. With respect to diligence, what was required was: "such care as an ordinary man might be expected to take on his own behalf." This was a dual subjective and objective test, and one deliberately pitched at a higher level. More recently, it has been suggested that both the tests of skill and diligence should be assessed objectively and subjectively; in the United Kingdom, the statutory provisions relating to directors' duties in the new Companies Act 2006 have been codified on this basis. Remedies for breach of duty. In most jurisdictions, the law provides for a variety of remedies in the event of a breach by the directors of their duties: Current trends. Historically, directors' duties have been owed almost exclusively to the company and its members, and the board was expected to exercise its powers for the financial benefit of the company. However, more recently there have been attempts to "soften" the position, and provide for more scope for directors to act as good corporate citizens. For example, in the United Kingdom, the Companies Act 2006 requires directors of companies "to promote the success of the company for the benefit of its members as a whole" and sets out the following six factors regarding a director's duty to promote success: This represents a considerable departure from the traditional notion that directors' duties are owed only to the company. Previously in the United Kingdom, under the Companies Act 1985, protections for non-member stakeholders were considerably more limited (see, for example, s.309 which permitted directors to take into account the interests of employees but which could only be enforced by the shareholders and not by the employees themselves). The changes have therefore been the subject of some criticism. Board of directors technology The adoption of technology that facilitates the meeting preparation and execution of directors continues to grow. Board directors are increasingly leveraging this technology to communicate and collaborate within a secure environment to access meeting materials, communicate with each other, and execute their governance responsibilities. This trend is particularly acute in the United States where a robust market of early adopters garnered acceptance of board software by organizations resulting in higher penetration of the board portal services in the region. The board and society. Most companies have weak mechanisms for bringing the voice of society into the board room. They rely on personalities who were not appointed for their understanding of societal issues. Often they give limited focus (both through time and financial resource) to issues of corporate responsibility and sustainability. A social board has society designed into its structure. It elevates the voice of society through specialist appointments to the board and mechanisms that empower innovation from within the organisation. Social boards align themselves with themes that are important to society. 
These may include measuring worker pay ratios, linking personal social and environmental objectives to remuneration, integrated reporting, fair tax, and B-Corp certification. Social boards recognise that they are part of society and that they require more than a licence to operate to succeed. They balance short-term shareholder pressure against long-term value creation, managing the business for a plurality of stakeholders including employees, shareholders, supply chains, and civil society. United States. Sarbanes–Oxley Act. The Sarbanes–Oxley Act of 2002 introduced new standards of accountability for boards of U.S. companies and companies listed on U.S. stock exchanges. Under the act, directors risk large fines and prison sentences in the case of accounting crimes. Internal control is now the direct responsibility of directors. The vast majority of companies covered by the act have hired internal auditors to ensure that the company adheres to required standards of internal control. The internal auditors are required by law to report directly to an audit committee, consisting of directors more than half of whom are outside directors, one of whom is a "financial expert". The law requires companies listed on the major stock exchanges (NYSE, NASDAQ) to have a majority of independent directors, that is, directors who are not otherwise employed by the firm or in a business relationship with it. Size. According to the Corporate Library's study, the average size of a publicly traded company's board is 9.2 members, and most boards range from 3 to 31 members. According to Investopedia, some analysts think the ideal size is seven. State law may specify a minimum number of directors, a maximum number of directors, and qualifications for directors (e.g. whether board members must be individuals or may be business entities). Committees. While a board may have several committees, two of them, the compensation committee and the audit committee, are critical and must be made up of at least three independent directors and no inside directors. Other common board committees are the nominating and governance committees. Compensation. Directors of Fortune 500 companies received median pay of $234,000 in 2011. Directorship is a part-time job. A 2011 study by the National Association of Corporate Directors in the United States estimated that directors averaged 4.3 hours a week on board work. Surveys have indicated that about 20% of nonprofit foundations pay their board members, as do 2% of American nonprofit organizations. 80% of nonprofit organizations require board members to personally contribute to the organization, a percentage that, as of 2007, had been increasing in recent years. Criticism. According to John Gillespie, a former investment banker and co-author of a book critical of boards, "Far too much of their time has been for check-the-box and cover-your-behind activities rather than real monitoring of executives and providing strategic advice on behalf of shareholders". At the same time, scholars have found that individual directors have a large effect on major corporate initiatives such as mergers and acquisitions and cross-border investments. According to the International Institute of Management, the rise in investor dissatisfaction, class-action lawsuits, and shareholder activism has resulted in more attention being directed toward the board of directors and the lack of corporate governance accountability. 
Shareholders' complaints include issues such as excessive executive compensation and the passive participation (lack of proper governance) of board members. The journal paper cites examples of underperforming CEOs of Fortune 500 companies who took compensation and retirement packages worth hundreds of millions of dollars while the stock price of their companies lost billions of dollars in value. The research proposes institutionalizing a standardized board evaluation framework to help balance executive compensation against the protection of shareholders' interests. The issue of gender representation on corporate boards of directors has been the subject of much criticism in recent years. Governments and corporations have responded with measures such as legislation mandating gender quotas and comply-or-explain systems to address the disproportionality of gender representation on corporate boards. A study of the French corporate elite has found that certain social classes are also disproportionately represented on boards, with those from the upper and, especially, upper-middle classes tending to dominate.
4823
46469005
https://en.wikipedia.org/wiki?curid=4823
Balkan Wars
The Balkan Wars were two conflicts that took place in the Balkan states in 1912 and 1913. In the First Balkan War, the four Balkan states of Greece, Serbia, Montenegro and Bulgaria declared war upon the Ottoman Empire and defeated it, in the process stripping the Ottomans of their European provinces and leaving only Eastern Thrace under Ottoman control. In the Second Balkan War, Bulgaria fought against the other four combatants of the first war, and also faced an attack from Romania from the north. The Ottoman Empire lost the bulk of its territory in Europe. Although not involved as a combatant, Austria-Hungary became relatively weaker as a much enlarged Serbia pushed for the union of the South Slavic peoples. The wars set the stage for the July Crisis of 1914 and served as a prelude to the First World War. By the early 20th century, Bulgaria, Greece, Montenegro and Serbia had achieved independence from the Ottoman Empire, but large elements of their ethnic populations remained under Ottoman rule. Over the course of the Macedonian Struggle, these states fought for influence, among themselves and with the Ottoman government, within Ottoman Macedonia, during which their governments came under the control of nationalists. In 1912, these countries united to form the Balkan League. The First Balkan War began on 8 October 1912, when the League member states attacked the Ottoman Empire, and ended eight months later with the signing of the Treaty of London on 30 May 1913. The Second Balkan War began on 16 June 1913, when Bulgaria, dissatisfied with its loss of Macedonia, attacked its former Balkan League allies. The combined Serbian and Greek armies, with their superior numbers, repelled the Bulgarian offensive and counter-attacked by invading Bulgaria from the west and the south. Romania, having taken no part in the conflict, had intact armies to strike with, and invaded Bulgaria from the north in violation of a peace treaty between the two states. The Ottoman Empire also attacked Bulgaria and advanced in Thrace, regaining Adrianople. In the resulting Treaty of Bucharest, Bulgaria managed to retain most of the territories it had gained in the First Balkan War. However, it was forced to cede the formerly Ottoman southern part of the Dobruja province to Romania. The Balkan Wars were marked by ethnic cleansing, with all parties being responsible for grave atrocities against civilians, and they inspired later atrocities, including war crimes during the 1990s Yugoslav Wars. Background. The background to the wars lies in the incomplete emergence of nation-states on the European territory of the Ottoman Empire during the second half of the 19th century. Serbia had gained substantial territory during the Russo-Turkish War (1877–1878), while Greece acquired Thessaly in 1881 (although it lost a small area back to the Ottoman Empire in 1897) and Bulgaria (an autonomous principality since 1878) incorporated the formerly distinct province of Eastern Rumelia (1885). All three countries, as well as Montenegro, sought additional territories within the large Ottoman-ruled region known as Rumelia, comprising Eastern Rumelia, Albania, Macedonia, and Thrace. The First Balkan War had several main causes, outlined in the following sections. Policies of the Great Powers. Throughout the 19th century, the Great Powers had different aims concerning the "Eastern Question" and the integrity of the Ottoman Empire. 
Russia wanted access to the "warm waters" of the Mediterranean from the Black Sea, and so pursued a pan-Slavic foreign policy and supported Bulgaria and Serbia. Britain wished to deny Russia access to the "warm waters" and supported the integrity of the Ottoman Empire, although it also supported a limited expansion of Greece as a backup plan in case the integrity of the Ottoman Empire could not be maintained. France wished to strengthen its position in the region, especially in the Levant (today's Lebanon, Syria, and Israel). Habsburg-ruled Austria-Hungary wished for a continuation of the existence of the Ottoman Empire, since both were troubled multinational entities and the collapse of the one might weaken the other. The Habsburgs also saw a strong Ottoman presence in the area as a counterweight to the Serbian nationalist appeal to the empire's own Serb subjects in Bosnia, Vojvodina and other parts of the empire. Italy's primary aim at the time seems to have been the denial of access to the Adriatic Sea to another major sea power. The German Empire, in turn, under the "Drang nach Osten" policy, aspired to turn the Ottoman Empire into its own de facto colony, and thus supported its integrity. In the late 19th and early 20th century, Bulgaria and Greece contended for Ottoman Macedonia and Thrace. Ethnic Greeks sought the forced "Hellenization" of ethnic Bulgars, who in turn sought the "Bulgarization" of Greeks, amid the general rise of nationalism. Both nations sent armed irregulars into Ottoman territory to protect and assist their ethnic kindred. From 1904, there was low-intensity warfare in Macedonia between the Greek and Bulgarian bands and the Ottoman army (the Struggle for Macedonia). After the Young Turk revolution of July 1908, the situation changed drastically. Young Turk Revolution. The 1908 Young Turk Revolution saw the reinstatement of constitutional monarchy in the Ottoman Empire and the start of the Second Constitutional Era. When the revolt broke out, it was supported by intellectuals, the army, and almost all the ethnic minorities of the Empire. It forced Sultan Abdul Hamid II to re-adopt the defunct Ottoman constitution of 1876 and to recall parliament. Hopes were raised among the Balkan ethnicities of reforms and autonomy. Elections were held to form a representative, multi-ethnic, Ottoman parliament. However, following the Sultan's failed counter-coup of 1909, the liberal element of the Young Turks was sidelined and the nationalist element became dominant. In October 1908, Austria-Hungary seized the opportunity of the Ottoman political upheaval to annex the "de jure" Ottoman province of Bosnia and Herzegovina, which it had occupied since 1878 (see "Bosnian Crisis"). Bulgaria declared independence as it had done in 1878, but this time the independence was internationally recognized. The Greeks of the autonomous Cretan State proclaimed unification with Greece, though the opposition of the Great Powers prevented the latter action from taking practical effect. Reaction in the Balkan states. Serbia was frustrated in the north by Austria-Hungary's incorporation of Bosnia. In March 1909, Serbia was forced to accept the annexation and to restrain anti-Habsburg agitation by Serbian nationalists. Instead, the Serbian government, under Prime Minister Nikola Pašić, looked to formerly Serb territories in the south, notably "Old Serbia" (the Sanjak of Novi Pazar and the province of Kosovo). On 15 August 1909, the Military League, a group of Greek officers, launched a coup. 
The Military League sought the creation of a new political system and thus summoned the Cretan politician Eleftherios Venizelos to Athens as its political advisor. Venizelos persuaded King George I to revise the constitution and asked the League to disband in favor of a National Assembly. In March 1910, the Military League dissolved itself. Bulgaria, which had secured Ottoman recognition of her independence in April 1909 and enjoyed the friendship of Russia, also looked to annex districts of Ottoman Thrace and Macedonia. In August 1910, Montenegro followed Bulgaria's precedent by becoming a kingdom. Pre-war treaties. Following the Italian victory in the Italo-Turkish War of 1911–1912, the severity of the Ottomanizing policy of the Young Turk regime, and a series of three revolts in Ottoman-held Albania, the Young Turks fell from power after a coup. The Christian Balkan countries felt compelled to act, seeing this as an opportunity to promote their national agendas by expanding into the territories of the failing empire and liberating their enslaved compatriots. To achieve that, a wide net of treaties was constructed and an alliance was formed. The negotiations among the Balkan states' governments started in the latter part of 1911 and were conducted entirely in secret. The treaties and military conventions were published in French translation after the Balkan Wars, on 24–26 November, in the Paris newspaper "Le Matin". In April 1911, the Greek prime minister Eleftherios Venizelos's attempt to reach an agreement with his Bulgarian counterpart and form a defensive alliance against the Ottoman Empire was fruitless, because of the doubts the Bulgarians held about the strength of the Greek Army. Later that year, in December 1911, Bulgaria and Serbia agreed to start negotiations on forming an alliance under the close supervision of Russia. The treaty between Serbia and Bulgaria was signed on 29 February/13 March 1912. Serbia sought expansion into "Old Serbia"; as Milan Milovanovich noted in 1909 to his Bulgarian counterpart, "As long as we are not allied with you, our influence over the Croats and Slovenes will be insignificant". On the other side, Bulgaria wanted autonomy for the Macedonian region under the influence of the two countries. The then Bulgarian Minister of Foreign Affairs, General Stefan Paprikov, stated in 1909, "It will be clear that if not today then tomorrow, the most important issue will again be the Macedonian Question. And this question, whatever happens, cannot be decided without more or less direct participation of the Balkan States". Last but not least, they set down the divisions that should be made of the Ottoman territories after a victorious outcome of the war. Bulgaria would gain all the territories east of the Rodopi Mountains and the River Strimona, while Serbia would annex the territories north and west of Mount Skardu. The alliance pact between Greece and Bulgaria was finally signed on 16/29 May 1912, without stipulating any specific division of Ottoman territories. In the summer of 1912, Greece proceeded to make "gentlemen's agreements" with Serbia and Montenegro. Despite the fact that a draft of an alliance pact with Serbia was submitted on 22 October, a formal pact was never signed, owing to the outbreak of the war. As a result, Greece did not have any territorial or other commitments beyond the common cause of fighting the Ottoman Empire. In April 1912, Montenegro and Bulgaria reached an agreement including financial aid to Montenegro in case of war with the Ottoman Empire. 
A gentlemen's agreement with Greece was reached soon after, as mentioned above. By the end of September, a political and military alliance between Montenegro and Serbia had also been achieved. By the end of September 1912, Bulgaria thus had formal written alliances with Serbia, Greece, and Montenegro. A formal alliance was also signed between Serbia and Montenegro, while the Greco-Montenegrin and Greco-Serbian agreements were essentially oral "gentlemen's agreements". All of these completed the formation of the Balkan League. Balkan League. At that time, the Balkan states had been able to maintain armies that were both numerous, in relation to each country's population, and eager to act, inspired by the idea that they would free enslaved parts of their homeland. The Bulgarian Army was the leading army of the coalition: a well-trained and fully equipped force, capable of facing the Imperial Army. It was suggested that the bulk of the Bulgarian Army would be deployed on the Thracian front, as it was expected that the front near the Ottoman capital would be the most crucial one. The Serbian Army would act on the Macedonian front, while the Greek Army was thought to be powerless on land; Greece was nonetheless needed in the Balkan League for its navy and its capability to dominate the Aegean Sea, cutting the Ottoman armies off from reinforcements. On 13 (O.S.)/26 September 1912, the Ottoman mobilization in Thrace forced Serbia and Bulgaria to act and order their own mobilizations. On 17/30 September, Greece also ordered mobilization. On 25 September/8 October, Montenegro declared war on the Ottoman Empire, after negotiations regarding the border status had failed. On 30 September/13 October, the ambassadors of Serbia, Bulgaria, and Greece delivered a common ultimatum to the Ottoman government, which was immediately rejected. The Empire withdrew its ambassadors from Sofia, Belgrade, and Athens, while the Bulgarian, Serbian and Greek diplomats left the Ottoman capital, delivering their declarations of war on 4/17 October 1912. First Balkan War. The three Slavic allies (Bulgaria, Serbia, and Montenegro) had laid out extensive plans to coordinate their war efforts, in continuation of their secret prewar settlements and under close Russian supervision (Greece was not included). Serbia and Montenegro would attack in the theater of the Sanjak, and Bulgaria and Serbia in Macedonia and Thrace. The Ottoman Empire's situation was difficult. Its population of about 26 million people provided a massive pool of manpower, but three-quarters of the population lived in the Asian part of the Empire. Reinforcements had to come from Asia mainly by sea, which depended on the result of battles between the Turkish and Greek navies in the Aegean. With the outbreak of the war, the Ottoman Empire activated three army headquarters: the Thracian HQ in Constantinople, the Western HQ in Salonika, and the Vardar HQ in Skopje, against the Bulgarians, the Greeks and the Serbians respectively. Most of their available forces were allocated to these fronts. Smaller independent units were allocated elsewhere, mostly around heavily fortified cities. Montenegro was the first to declare war, on 8 October (25 September O.S.). Its main thrust was towards Shkodra, with secondary operations in the Novi Pazar area. The rest of the Allies, after delivering a common ultimatum, declared war a week later. 
Bulgaria attacked towards Eastern Thrace, being stopped only at the outskirts of Constantinople, at the Çatalca line and the isthmus of the Gallipoli peninsula, while secondary forces captured Western Thrace and Eastern Macedonia. Serbia attacked south towards Skopje and Monastir and then turned west to present-day Albania, reaching the Adriatic, while a second army captured Kosovo and linked up with the Montenegrin forces. Greece's main forces attacked from Thessaly into Macedonia through the Sarantaporo pass. On 7 November, in response to an Ottoman initiative, they entered into negotiations for the surrender of Thessaloniki. With the Greeks already there, and the Bulgarian 7th Rila Division moving swiftly from the north towards Thessaloniki, Hassan Tahsin Pasha considered his position hopeless. The Greeks offered more attractive terms than the Bulgarians did. On 8 November, Tahsin Pasha agreed to terms and 26,000 Ottoman troops passed into Greek captivity. Before the Greeks entered the city, a German warship whisked the former sultan Abdul Hamid II out of Thessaloniki to continue his exile across the Bosporus from Constantinople. With their army in Thessaloniki, the Greeks took new positions to the east and northeast, including Nigrita. On 12 November (26 October 1912 O.S.), Greece expanded its occupied area and joined up with the Serbian army to the northwest, while its main forces turned east towards Kavala, reaching the Bulgarians. Another Greek army attacked into Epirus towards Ioannina. On the naval front, the Ottoman fleet twice exited the Dardanelles and was twice defeated by the Greek Navy, in the battles of Elli and Lemnos. Greek dominance of the Aegean Sea made it impossible for the Ottomans to transfer the planned troops from the Middle East to the Thracian front (against the Bulgarians) and to the Macedonian front (against the Greeks and Serbians). According to E. J. Erickson, the Greek Navy also played a crucial, albeit indirect, role in the Thracian campaign by neutralizing no fewer than three Thracian corps (see First Balkan War, the Bulgarian theater of operations), a significant portion of the Ottoman Army there, in the all-important opening round of the war. After the defeat of the Ottoman fleet, the Greek Navy was also free to occupy the islands of the Aegean. General Nikola Ivanov identified the activity of the Greek Navy as the chief factor in the general success of the allies. In January 1913, after a successful coup by young army officers, the Ottoman Empire decided to continue the war. After a failed Ottoman counter-attack on the western Thracian front, Bulgarian forces, with the help of the Serbian Army, managed to capture Adrianople, while Greek forces managed to take Ioannina after defeating the Ottomans in the Battle of Bizani. In the joint Serbian-Montenegrin theater of operations, the Montenegrin army besieged and captured Shkodra, ending the Ottoman presence in Europe west of the Çatalca line after nearly 500 years. The war ended officially with the Treaty of London on 30 (17) May 1913. Prelude to the Second Balkan War. After pressure from the Great Powers on Greece and Serbia, which had postponed signing in order to fortify their defensive positions, the signing of the Treaty of London took place on 30 May 1913. With this treaty, the war between the Balkan Allies and the Ottoman Empire came to an end. 
From then on, the Great Powers held the right of decision over the territorial adjustments to be made, which even led to the creation of an independent Albania. Every Aegean island belonging to the Ottoman Empire, with the exception of Imbros and Tenedos, was handed over to the Greeks, including the island of Crete. Furthermore, all European territory of the Ottoman Empire west of the Enez-Kıyıköy line was ceded to the Balkan League, but the division of the territory among the League was not decided by the Treaty itself. This led to the formation of two de facto military occupation zones on Macedonian territory, as Greece and Serbia tried to create a common border. The Bulgarians, in turn, were furious about Serbia's refusal to honour its commitments under the 1912 Serbo-Bulgarian Treaty, which had split the geographic region of Macedonia into two zones: one contested between Serbia and Bulgaria, and another recognised by the two countries as Bulgarian by rights. Before the war, Serbia had relinquished its claim to the territory east of the Ohrid-Kriva Palanka line in favour of Bulgaria (the "Uncontested Zone"), while the future of some 11,000 km2 of territory forming the northwestern corner of the geographic region of Macedonia (the "Contested Zone") was to be decided by the Russian Emperor, who was the senior arbitrator and guarantor of the alliance. Assured by the clauses of the Treaty that it would receive what it considered its fair share of Macedonia, Bulgaria sent almost all of its troops to the Thracian front, which was expected to decide the outcome of the war, and eventually did. At the same time, Serbia pushed into Kosovo and northern Macedonia. As the establishment of an independent Albanian state, brokered by Italy and Austria-Hungary, deprived the Serbs of their much-coveted Adriatic port, they demanded not only the entire "Contested Zone" but also all of the "Uncontested" one they had occupied. Bulgarian efforts to appeal to the Russian Emperor, citing, for example, the clauses of the Treaty, the material difference between Serbian (29,698) and Bulgarian (87,926) casualties, or the surrender of the Bulgarian city of Silistra to Romania as compensation for its continued neutrality, proved futile. Russian Foreign Minister Sergey Sazonov instead kept encouraging Bulgaria to accede to an ever-increasing list of Serbian demands. In the end, three factors paved the way to another conflict: Bulgaria's overreliance on the Russian Empire, a power which had anathemised the Unification of Bulgaria, invited the Ottoman Sultan to reconquer Eastern Rumelia and organised a coup against the Bulgarian Prince only three decades prior, and which had watched Ferdinand's charge towards Istanbul with ill-disguised alarm due to its own long-standing aspirations towards the Turkish Straits; Bulgaria's unwillingness to reach a compromise with Greece, despite several attempts made by Greek Prime Minister Venizelos; and Serbian insistence on keeping all conquered territory. On 1 May 1913, Greece and Serbia settled their differences and signed a military alliance directed against Bulgaria. On the night of 29 June 1913, the Bulgarian army made an ill-advised attempt to gain an advantage in the negotiations by pushing Serbian and Greek forces out of several disputed territories without a battle plan or declaration of war, naively thinking that this would be regarded as a mere altercation. Instead, the action gave Serbia and Greece a casus belli and set off the Second Balkan War.
Second Balkan War. Though the Balkan allies had fought together against the common enemy, that was not enough to overcome their mutual rivalries. In the original document of the Balkan League, Serbia had promised Bulgaria most of Macedonia. But before the first war had come to an end, Serbia (in violation of the previous agreement) and Greece revealed their plan to keep possession of the territories that their forces had occupied. This act prompted the tsar of Bulgaria to invade his allies. The Second Balkan War broke out on 29 (16) June 1913, when Bulgaria attacked its erstwhile allies in the First Balkan War, Serbia and Greece, while Montenegro and the Ottoman Empire intervened later against Bulgaria, and Romania attacked Bulgaria from the north in violation of a peace treaty. When the Greek army had entered Thessaloniki in the First Balkan War, ahead of the Bulgarian 7th division by only a day, it was asked to allow a Bulgarian battalion to enter the city. Greece accepted, in exchange for allowing a Greek unit to enter the city of Serres. The Bulgarian unit that entered Thessaloniki turned out to be an 18,000-strong division instead of a battalion, which caused concern among the Greeks, who viewed it as a Bulgarian attempt to establish a condominium over the city. In the event, owing to the urgent need for reinforcements on the Thracian front, the Bulgarian headquarters was soon forced to remove its troops from the city (while the Greeks agreed by mutual treaty to remove their units based in Serres) and transport them to Dedeağaç (modern Alexandroupolis), but it left behind a battalion that started fortifying its positions. Greece had also allowed the Bulgarians to control the stretch of the Thessaloniki-Constantinople railroad that lay in Greek-occupied territory, since Bulgaria controlled the largest part of this railroad towards Thrace. After the end of the operations in Thrace, and confirming Greek concerns, Bulgaria was not satisfied with the territory it controlled in Macedonia and immediately asked Greece to relinquish its control over Thessaloniki and the land north of Pieria, effectively handing over all of Greek Macedonia. These demands, together with the Bulgarian refusal to demobilize its army after the Treaty of London had ended the common war against the Ottomans, alarmed Greece, which decided to keep its own army mobilized as well. A month after the Second Balkan War started, the Bulgarian community of Thessaloniki no longer existed, as hundreds of long-time Bulgarian locals were arrested. Thirteen hundred Bulgarian soldiers and about five hundred komitadjis were also arrested and transferred to Greek prisons. In November 1913, the Bulgarians were forced to admit their defeat, as the Greeks received international recognition of their claim to Thessaloniki. Similarly, in modern North Macedonia, the tension between Serbia and Bulgaria, due to the latter's aspirations over Vardar Macedonia, generated many incidents between their respective armies, prompting Serbia to keep its army mobilized. Serbia and Greece proposed that each of the three countries reduce its army by a quarter as the first step towards a peaceful solution, but Bulgaria rejected it. Seeing the omens, Greece and Serbia started a series of negotiations and signed a treaty on 1 June (19 May) 1913. With this treaty, a mutual border was agreed between the two countries, together with an agreement for mutual military and diplomatic support in case of a Bulgarian and/or Austro-Hungarian attack.
Tsar Nicholas II of Russia, being well informed, tried to stop the upcoming conflict on 8 June by sending an identical personal message to the Kings of Bulgaria and Serbia, offering to act as arbitrator according to the provisions of the 1912 Serbo-Bulgarian treaty. But Bulgaria, by making its acceptance of Russian arbitration conditional, in effect denied any discussion, causing Russia to repudiate its alliance with Bulgaria (see the Russo-Bulgarian military convention signed 31 May 1902). The Serbs and the Greeks had a military advantage on the eve of the war because their armies had confronted comparatively weak Ottoman forces in the First Balkan War and had suffered relatively light casualties, while the Bulgarians had been involved in heavy fighting in Thrace. The Serbs and Greeks had also had time to fortify their positions in Macedonia. The Bulgarians held some advantages of their own, controlling internal communication and supply lines. On 29 (16) June 1913, General Savov, under direct orders of Tsar Ferdinand I, issued attack orders against both Greece and Serbia without consulting the Bulgarian government and without an official declaration of war. During the night of 30 (17) June 1913, the Bulgarians attacked the Serbian army at the Bregalnica river and then the Greek army in Nigrita. The Serbian army resisted the sudden night attack, while most of its soldiers did not even know whom they were fighting, as the Bulgarian camps were located next to the Serbian ones and the Bulgarians were considered allies. Montenegro's forces were just a few kilometers away and also rushed to the battle. The Bulgarian attack was halted. The Greek army was also successful. It retreated according to plan for two days while Thessaloniki was cleared of the remaining Bulgarian regiment. Then the Greek army counterattacked and defeated the Bulgarians at Kilkis (Kukush), after which the town, which was mostly Bulgarian, was plundered and burnt, and part of its population was massacred by the Greek army. Following the capture of Kilkis, the Greek army's pace was not quick enough to prevent the retaliatory destruction of Nigrita, Serres, and Doxato and massacres of non-combatant Greek inhabitants at Sidirokastro and Doxato by the Bulgarian army. The Greek army committed further war crimes against the Bulgarian population during its advance: in total, about 160 Bulgarian villages were destroyed and most of their inhabitants expelled, with multiple additional massacres of the Bulgarian civilian population. The Greek army then divided its forces and advanced in two directions. One part proceeded east and occupied Western Thrace. The rest of the Greek army advanced up the Struma River valley, defeating the Bulgarian army in the battles of Doiran and Mt. Beles, and continued its advance north towards Sofia. In the Kresna Gorge, the Greeks were ambushed by the Bulgarian 2nd and 1st Armies, newly arrived from the Serbian front, which had already taken defensive positions there following the Bulgarian victory at Kalimanci. By 30 July, the Greek army was outnumbered by the counter-attacking Bulgarian army, which attempted to encircle the Greeks in a Cannae-type battle by applying pressure on their flanks. The Greek army was exhausted and faced logistical difficulties. The battle continued for 11 days, between 29 July and 9 August, over a 20 km maze of forests and mountains, without conclusion.
The Greek king, seeing that the units he was fighting were from the Serbian front, tried to convince the Serbs to renew their attack, as the front ahead of them was now thinner, but the Serbs declined. By then, news came of the Romanian advance toward Sofia and the city's imminent fall. Facing the danger of encirclement, Constantine realized that his army could no longer continue hostilities. Thus, he agreed to Eleftherios Venizelos' proposal and accepted the Bulgarian request for an armistice, as communicated through Romania. Romania had raised an army and declared war on Bulgaria on 10 July (27 June), having officially warned Bulgaria on 28 (15) June that it would not remain neutral in a new Balkan war, owing to Bulgaria's refusal to cede the fortress of Silistra as promised before the First Balkan War in exchange for Romanian neutrality. Romanian forces encountered little resistance and, by the time the Greeks accepted the Bulgarian request for an armistice, they had reached Vrazhdebna, on the outskirts of Sofia. Seeing the military position of the Bulgarian army, the Ottomans decided to intervene. They attacked and, finding no opposition, managed to win back all of the lands that had been officially ceded to Bulgaria by the Treaty of London, i.e. Thrace with its fortified city of Adrianople, regaining an area in Europe only slightly larger than the present-day European territory of the Republic of Turkey. Reactions among the Great Powers during the wars. The developments that led to the First Balkan War did not go unnoticed by the Great Powers. Although there was an official consensus among the European Powers over the territorial integrity of the Ottoman Empire, which led to a stern warning to the Balkan states, unofficially each of them took a different diplomatic approach due to their conflicting interests in the area. As a result, any possible preventive effect of the common official warning was cancelled out by the mixed unofficial signals, and the war was neither prevented nor stopped. The Second Balkan War was a catastrophic blow to Russian policies in the Balkans, which for centuries had focused on access to the "warm seas". First, it marked the end of the Balkan League, a vital arm of the Russian system of defense against Austria-Hungary. Second, the clearly pro-Serbian position Russia had been forced to take in the conflict, mainly due to the disagreements over land partitioning between Serbia and Bulgaria, caused a permanent break-up between Russia and Bulgaria. Accordingly, Bulgaria shifted its policy closer to the Central Powers' understanding of an anti-Serbian front, as its national aspirations were now directed mainly against Serbia. As a result, Serbia was isolated militarily against its rival Austria-Hungary, a development that eventually doomed Serbia in the coming war a year later. Most damaging of all, the new situation effectively trapped Russian foreign policy: after 1913, Russia could not afford to lose its last ally in this crucial area and thus had no alternative but to unconditionally support Serbia when the crisis between Serbia and Austria escalated in 1914. This was a position that inevitably drew Russia into an unwelcome world war, with devastating results, since it was less prepared (both militarily and socially) for that event than any other Great Power.
Austria-Hungary took alarm at the great increase in Serbia's territory, at the expense of Austria-Hungary's national aspirations in the region, as well as at Serbia's rising status, especially among Austria-Hungary's Slavic populations. This concern was shared by Germany, which saw Serbia as a satellite of Russia. These concerns contributed significantly to the two Central Powers' willingness to go to war against Serbia. This meant that when a Serbian-backed organization assassinated Archduke Franz Ferdinand of Austria, the reform-minded heir to the Austro-Hungarian throne, causing the 1914 July Crisis, the conflict quickly escalated and resulted in the First World War. Epilogue. The Treaty of Bucharest. The epilogue to this nine-month pan-Balkan war was drawn mostly by the Treaty of Bucharest of 10 August 1913. Delegates of Greece, Serbia, Montenegro, and Bulgaria, hosted by Romania, arrived in Bucharest to settle the negotiations. The Ottoman request to participate was rejected, on the basis that the talks were to deal with matters strictly among the Balkan allies. The Great Powers maintained a very influential presence, but they did not dominate the proceedings. The Treaty partitioned Macedonia, made changes to the Balkan borders, and established the independent state of Albania. Serbia gained the territory of north-east Macedonia, settled its eastern border with Bulgaria, and gained the eastern half of the Sanjak of Novi Pazar, doubling its size. Montenegro gained the western half of the Sanjak of Novi Pazar and secured its borders with Serbia. Greece more than doubled its size by gaining southern Epirus and the largest part of southern Macedonia, including the port city of Kavala on its eastern border. The Aegean Islands, apart from the Dodecanese, were annexed by the Greek Kingdom, and the unification with Crete was completed and formalized. Romania annexed the southern part of the Dobruja province. Bulgaria, even though defeated, managed to hold some territorial gains from the First Balkan War: it retained a portion of Macedonia, including the town of Strumnitza, and Western Thrace, with a 70-mile Aegean coastline including the port town of Alexandroupolis. The Final Treaties. The Bulgarian delegates then met the Ottomans for negotiations in Constantinople. Bulgaria's hope of regaining the lost territories in Eastern Thrace, which the bulk of the Bulgarian forces had struggled to conquer and where many had died, was dashed, as the Turks retained the lands they had regained in their counter-attack. The straight Ainos-Midia line was not accepted as the eastern border; the regions of Lozengrad, Lyule Burgas-Buni Hisar, and Adrianople reverted to the Ottomans. After this Treaty of Constantinople, of 30 September 1913, Bulgaria sought an alliance with the Ottoman Empire against Greece and Serbia (in support of its claims to Macedonia). This was followed by the Treaty of Athens of 14 November 1913, between the Turks and Greeks, concluding the conflict between them. However, the status of the Aegean Islands under Greek control was left in question, especially that of Imvros and Tenedos, strategically positioned near the Dardanelles Straits. Even after the signing of this treaty, relations between the two countries remained very bad, and war almost broke out in the spring of 1914. Finally, a second treaty in Constantinople re-established relations between Serbia and the Ottoman Empire, officially concluding the Balkan Wars. Montenegro never signed a pact with the Turks.
The Balkan Wars brought to an end Ottoman rule of the Balkan Peninsula, except for Eastern Thrace and Constantinople (now Istanbul). The Young Turk regime was unable to reverse the Empire's decline but remained in power, establishing a dictatorship in June 1913. Legacy. Citizens of Turkey regard the Balkan Wars as a major disaster ("Balkan harbi faciası") in the nation's history. The Ottoman Empire lost all its European territories west of the River Maritsa as a result of the two Balkan Wars, which thus delineated present-day Turkey's western border. By 1923, only 38% of the Muslim population of 1912 still lived in the Balkans, and the majority of the Balkan Turks had been killed or expelled. The unexpected fall and sudden relinquishing of the Turkish-dominated European territories was a traumatic event for many Turks and one that triggered the ultimate collapse of the empire itself within five years. Paul Mojzes has called the Balkan Wars an "unrecognized genocide". Nazım Pasha, Chief of Staff of the Ottoman Army, was held responsible for the failure and was assassinated on 23 January 1913 during the 1913 Ottoman coup d'état. Most Greeks regard the Balkan Wars as a period of epic achievements. They managed to liberate and gain territories that had been inhabited by Greeks since ancient times, and they doubled the size of the Greek Kingdom, capturing major cities such as Thessaloniki and Ioannina that had been under Ottoman rule for almost half a millennium. The Greek Army, small and ill-equipped compared to the superior Ottoman, Bulgarian, and Serbian armies, won very important battles. That made Greece a valuable piece in the Great Powers' chess game. Two great personalities rose in the Greek political arena: Prime Minister Eleftherios Venizelos, the leading mind behind Greek foreign policy, and Crown Prince, and later King, Konstantinos I, the commander-in-chief of the Greek Army.
4825
46700810
https://en.wikipedia.org/wiki?curid=4825
BeBox
The BeBox is a discontinued personal computer from Be Inc., running the company's operating system, later named BeOS. It has two PowerPC CPUs, its I/O board has a custom "GeekPort", and the front bezel has "Blinkenlights". The BeBox debuted in October 1995 with dual 66 MHz PowerPC 603 processors. The processors were upgraded to 133 MHz in August 1996 (BeBox Dual603e-133). Production was halted in January 1997, following the port of BeOS to the Macintosh, so that the company could concentrate on software. Be sold around 1,000 66 MHz BeBoxes and 800 133 MHz BeBoxes. CPU configuration. Production models use two 66 MHz PowerPC 603 processors or two 133 MHz PowerPC 603e processors to power the BeBox. Prototypes with dual 200 MHz CPUs or four CPUs exist, but were never publicly available. Main board. The main board is in the standard AT format commonly found in PCs. It uses standard PC components to keep the machine as inexpensive as possible. I/O board. The I/O board offers four serial ports (9-pin D-sub), a PS/2 mouse port, and two joystick ports (15-pin D-sub). There are four DIN MIDI ports (two in, two out), two stereo pairs of RCA connectors for audio line-level input and output, and a pair of 3.5 mm stereo phone jacks for microphone input and headphone output. There are also internal audio connectors: a 5-pin strip for audio CD line-level playback, and two 4-pin strips for microphone input and headphone output. The audio is produced by a 16-bit stereo DAC sound system capable of 48 kHz and 44.1 kHz sample rates (see the data-rate sketch at the end of this article). For more unusual uses, there are three 4-pin mini-DIN infrared (IR) I/O ports. GeekPort. An experimental-electronic-development-oriented port, backed by three fuses on the mainboard, the 37-pin D-sub "GeekPort" provides digital and analog I/O and DC power on the ISA bus. "Blinkenlights". Two yellow/green vertical LED arrays, dubbed the "blinkenlights", are built into the front bezel to illustrate the CPU load. The bottommost LED on the right side indicates hard disk activity. Market. Be called the BeBox: "the first true real-time, portable, object-oriented system that features multiple PowerPC processors, true preemptive multitasking, an integrated database, fast I/O, and a wide range of expansion options — all at an extremely aggressive price that is well below that of any competitive offering." BeBox creator Jean-Louis Gassée did not see the BeBox as a general consumer device, warning that "Before we let you use the BeBox, we believe you must have some aptitude toward programming; the standard language is C++." Prototype. As of 1993, prototypes (at the time called the Be Machine) had two 25 MHz AT&T Hobbit processors and three AT&T 9308S DSPs. In 2009, a rare prototype of the BeBox with Hobbit processors was sold at auction. Linux. Craftworks Solutions developed Be_Linux for the BeBox. Craftworks and Be Inc. announced in 1996 that they would work together to bring Be_Linux to the BeBox.
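As a back-of-the-envelope illustration of the audio subsystem described above, the short Python sketch below computes the raw PCM data rates implied by the stated 16-bit stereo output at the two supported sample rates. The assumption of uncompressed linear PCM is made here purely for illustration; it is not taken from Be's documentation.

# Raw PCM data rates for 16-bit stereo audio at the BeBox's
# two supported sample rates (uncompressed linear PCM assumed).
BYTES_PER_SAMPLE = 2   # 16-bit samples
CHANNELS = 2           # stereo
for rate_hz in (44_100, 48_000):
    bytes_per_second = rate_hz * BYTES_PER_SAMPLE * CHANNELS
    print(f"{rate_hz} Hz: {bytes_per_second} B/s ({bytes_per_second / 1024:.1f} KiB/s)")
# Prints: 44100 Hz: 176400 B/s (172.3 KiB/s)
#         48000 Hz: 192000 B/s (187.5 KiB/s)

Even at the higher rate the stream stays under 200 KB/s, comfortably within the bandwidth of the ISA bus of the era.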
4827
1295421680
https://en.wikipedia.org/wiki?curid=4827
Biomedical engineering
Biomedical engineering (BME) or medical engineering is the application of engineering principles and design concepts to medicine and biology for healthcare applications (e.g., diagnostic or therapeutic purposes). BME also integrates the logical sciences to advance health care treatment, including diagnosis, monitoring, and therapy. Also included under the scope of a biomedical engineer is the management of current medical equipment in hospitals while adhering to relevant industry standards. This involves procurement, routine testing, preventive maintenance, and making equipment recommendations, a role also known as a Biomedical Equipment Technician (BMET) or as a clinical engineer. Biomedical engineering has only recently emerged as its own field of study, compared to many other engineering fields. Such an evolution is common as a new field transitions from being an interdisciplinary specialization among already-established fields to being considered a field in itself. Much of the work in biomedical engineering consists of research and development, spanning a broad array of subfields (see below). Prominent biomedical engineering applications include the development of biocompatible prostheses, various diagnostic and therapeutic medical devices ranging from clinical equipment to micro-implants, imaging technologies such as MRI and EKG/ECG, regenerative tissue growth, and the development of pharmaceutical drugs including biopharmaceuticals. Subfields and related fields. Bioinformatics. Bioinformatics is an interdisciplinary field that develops methods and software tools for understanding biological data. As an interdisciplinary field of science, bioinformatics combines computer science, statistics, mathematics, and engineering to analyze and interpret biological data. Bioinformatics is considered both an umbrella term for the body of biological studies that use computer programming as part of their methodology, as well as a reference to specific analysis "pipelines" that are repeatedly used, particularly in the field of genomics. Common uses of bioinformatics include the identification of candidate genes and single-nucleotide polymorphisms (SNPs); a minimal illustrative sketch of such a sequence comparison appears at the end of this article. Often, such identification is made with the aim of better understanding the genetic basis of disease, unique adaptations, desirable properties (especially in agricultural species), or differences between populations. In a less formal way, bioinformatics also tries to understand the organizational principles within nucleic acid and protein sequences. Biomechanics. Biomechanics is the study of the structure and function of the mechanical aspects of biological systems, at any level from whole organisms to organs, cells, and cell organelles, using the methods of mechanics. Biomaterials. A biomaterial is any matter, surface, or construct that interacts with living systems. As a science, biomaterials is about fifty years old. The study of biomaterials is called biomaterials science or biomaterials engineering. It has experienced steady and strong growth over its history, with many companies investing large amounts of money in the development of new products. Biomaterials science encompasses elements of medicine, biology, chemistry, tissue engineering, and materials science. Biomedical optics. Biomedical optics combines the principles of physics, engineering, and biology to study the interaction of biological tissue and light, and how this can be exploited for sensing, imaging, and treatment.
It has a wide range of applications, including optical imaging, microscopy, ophthalmoscopy, spectroscopy, and therapy. Examples of biomedical optics techniques and technologies include "optical coherence tomography" (OCT), "fluorescence microscopy", "confocal microscopy", and "photodynamic therapy" (PDT). OCT, for example, uses light to create high-resolution, three-dimensional images of internal structures, such as the "retina" in the eye or the "coronary arteries" in the heart. Fluorescence microscopy involves labeling specific molecules with fluorescent dyes and visualizing them using light, providing insights into biological processes and disease mechanisms. More recently, "adaptive optics" has been aiding imaging by correcting aberrations in biological tissue, enabling higher-resolution imaging and improved accuracy in procedures such as laser surgery and retinal imaging. Tissue engineering. Tissue engineering, like genetic engineering (see below), is a major segment of biotechnology, which overlaps significantly with BME. One of the goals of tissue engineering is to create artificial organs (via biological material), such as kidneys and livers, for patients who need organ transplants. Biomedical engineers are currently researching methods of creating such organs. Researchers have grown solid jawbones and tracheas from human stem cells towards this end. Several artificial urinary bladders have been grown in laboratories and transplanted successfully into human patients. Bioartificial organs, which use both synthetic and biological components, are also a focus area in research, such as hepatic assist devices that use liver cells within an artificial bioreactor construct. Genetic engineering. Genetic engineering, recombinant DNA technology, genetic modification/manipulation (GM), and gene splicing are terms that apply to the direct manipulation of an organism's genes. Unlike traditional breeding, an indirect method of genetic manipulation, genetic engineering utilizes modern tools such as molecular cloning and transformation to directly alter the structure and characteristics of target genes. Genetic engineering techniques have found success in numerous applications. Some examples include the improvement of crop technology ("not a medical application", but see biological systems engineering), the manufacture of synthetic human insulin through the use of modified bacteria, the manufacture of erythropoietin in hamster ovary cells, and the production of new types of experimental mice such as the oncomouse (cancer mouse) for research. Neural engineering. Neural engineering (also known as neuroengineering) is a discipline that uses engineering techniques to understand, repair, replace, or enhance neural systems. Neural engineers are uniquely qualified to solve design problems at the interface of living neural tissue and non-living constructs. Neural engineering can assist with numerous things, including the future development of prosthetics. For example, cognitive neural prosthetics (CNP) are being heavily researched and would allow a chip implant to assist people who have prosthetics by providing signals to operate assistive devices. Pharmaceutical engineering. Pharmaceutical engineering is an interdisciplinary science that includes drug engineering, novel drug delivery and targeting, pharmaceutical technology, unit operations of chemical engineering, and pharmaceutical analysis.
It may be deemed a part of pharmacy due to its focus on the use of technology on chemical agents in providing better medicinal treatment. Hospital and medical devices. This is an extremely broad category, essentially covering all health care products that do not achieve their intended results through predominantly chemical (e.g., pharmaceuticals) or biological (e.g., vaccines) means, and do not involve metabolism. A medical device is intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease. Some examples include pacemakers, infusion pumps, the heart-lung machine, dialysis machines, artificial organs, implants, artificial limbs, corrective lenses, cochlear implants, ocular prosthetics, facial prosthetics, somato prosthetics, and dental implants. Stereolithography is a practical example of "medical modeling" being used to create physical objects. Beyond modeling organs and the human body, emerging engineering techniques are also currently used in the research and development of new devices for innovative therapies, treatments, and patient monitoring of complex diseases. In the US, medical devices are regulated and classified into three classes, with regulatory control increasing from Class I (general controls) to Class III (premarket approval) (see also "Regulation"). Medical imaging. Medical/biomedical imaging is a major segment of medical devices. This area deals with enabling clinicians to directly or indirectly "view" things not visible in plain sight (for example, due to their size or location). This can involve utilizing ultrasound, magnetism, UV, radiology, and other means. Alternatively, navigation-guided equipment utilizes electromagnetic tracking technology, for applications such as catheter placement into the brain or feeding tube placement; one example is ENvizion Medical's ENvue, an electromagnetic navigation system for enteral feeding tube placement. The system uses an external field generator and several passive EM sensors, enabling scaling of the display to the patient's body contour and a real-time view of the feeding tube tip's location and direction, which helps the medical staff ensure correct placement in the GI tract. Imaging technologies are often essential to medical diagnosis and are typically the most complex equipment found in a hospital, including: fluoroscopy, magnetic resonance imaging (MRI), nuclear medicine, positron emission tomography (PET), PET-CT scans, projection radiography such as X-rays and CT scans, tomography, ultrasound, optical microscopy, and electron microscopy. Medical implants. An implant is a kind of medical device made to replace and act as a missing biological structure (as compared with a transplant, which indicates transplanted biomedical tissue). The surfaces of implants that contact the body might be made of a biomedical material such as titanium, silicone, or apatite, depending on what is the most functional. In some cases, implants contain electronics, e.g. artificial pacemakers and cochlear implants. Some implants are bioactive, such as subcutaneous drug delivery devices in the form of implantable pills or drug-eluting stents. Bionics. Artificial body part replacements are one of the many applications of bionics. Concerned with the intricate and thorough study of the properties and function of human body systems, bionics may be applied to solve some engineering problems. Careful study of the different functions and processes of the eyes, ears, and other organs paved the way for improved cameras, television, radio transmitters and receivers, and many other tools. Biomedical sensors. In recent years, biomedical sensors based on microwave technology have gained more attention.
Different sensors can be manufactured for specific uses in both diagnosing and monitoring disease conditions; for example, microwave sensors can be used as a complementary technique to X-ray imaging to monitor lower-extremity trauma. Such a sensor monitors the dielectric properties of tissue and can thus detect changes in the tissue (bone, muscle, fat, etc.) under the skin, so that when measurements are taken at different times during the healing process, the sensor's response changes as the trauma heals. Clinical engineering. Clinical engineering is the branch of biomedical engineering dealing with the actual implementation of medical equipment and technologies in hospitals or other clinical settings. Major roles of clinical engineers include training and supervising biomedical equipment technicians (BMETs), selecting technological products/services and logistically managing their implementation, working with governmental regulators on inspections/audits, and serving as technological consultants for other hospital staff (e.g., physicians, administrators, I.T., etc.). Clinical engineers also advise and collaborate with medical device producers regarding prospective design improvements based on clinical experiences, as well as monitor the progression of the state of the art so as to redirect procurement patterns accordingly. Their inherent focus on "practical" implementation of technology has tended to keep them oriented more towards "incremental"-level redesigns and reconfigurations, as opposed to revolutionary research and development or ideas that would be many years from clinical adoption; however, there is a growing effort to expand this time-horizon over which clinical engineers can influence the trajectory of biomedical innovation. In their various roles, they form a "bridge" between the primary designers and the end-users, combining the perspective of being close to the point-of-use with training in product and process engineering. Clinical engineering departments will sometimes hire not just biomedical engineers, but also industrial/systems engineers to help address operations research/optimization, human factors, cost analysis, etc. Also, see safety engineering for a discussion of the procedures used to design safe systems. A clinical engineering department is typically staffed with a manager, supervisors, engineers, and technicians, with a commonly cited ratio of one engineer per eighty hospital beds. Clinical engineers are also authorized to audit pharmaceutical and associated stores to monitor FDA recalls of invasive items. Rehabilitation engineering. Rehabilitation engineering is the systematic application of engineering sciences to design, develop, adapt, test, evaluate, apply, and distribute technological solutions to problems confronted by individuals with disabilities. Functional areas addressed through rehabilitation engineering may include mobility, communications, hearing, vision, and cognition, as well as activities associated with employment, independent living, education, and integration into the community. While some rehabilitation engineers have master's degrees in rehabilitation engineering, usually a subspecialty of biomedical engineering, most rehabilitation engineers have undergraduate or graduate degrees in biomedical engineering, mechanical engineering, or electrical engineering. A Portuguese university provides an undergraduate degree and a master's degree in Rehabilitation Engineering and Accessibility.
Qualification as a rehabilitation engineer in the UK is possible via a university BSc Honours degree course, such as that offered by the Health Design & Technology Institute at Coventry University. The rehabilitation process for people with disabilities often entails the design of assistive devices, such as walking aids, intended to promote the inclusion of their users in the mainstream of society, commerce, and recreation. Regulatory issues. Regulatory requirements have constantly increased in recent decades in response to the many incidents in which devices caused harm to patients. For example, from 2008 to 2011, there were 119 FDA recalls of medical devices in the US classified as Class I. According to the U.S. Food and Drug Administration (FDA), a Class I recall is associated with "a situation in which there is a reasonable probability that the use of, or exposure to, a product will cause serious adverse health consequences or death". Regardless of country-specific legislation, the main regulatory objectives coincide worldwide. For example, under the medical device regulations, a product must be (1) safe and (2) effective, and (3) these properties must hold for all the manufactured devices. A product is safe if patients, users, and third parties do not run unacceptable risks of physical hazards, such as injury or death, in its intended use. Protective measures must be introduced on hazardous devices to reduce residual risks to a level that is acceptable when compared with the benefit derived from the use of the device. A product is effective if it performs as specified by the manufacturer in the intended use. Proof of effectiveness is achieved through clinical evaluation, compliance with performance standards, or demonstration of substantial equivalence with an already marketed device. These properties have to be ensured for all the manufactured items of the medical device, which requires that a quality system be in place for all the relevant entities and processes that may impact safety and effectiveness over the whole medical device lifecycle. The medical device engineering area is among the most heavily regulated fields of engineering, and practicing biomedical engineers must routinely consult and cooperate with regulatory law attorneys and other experts. The Food and Drug Administration (FDA) is the principal healthcare regulatory authority in the United States, having jurisdiction over medical "devices, drugs, biologics, and combination" products. The paramount objectives driving policy decisions by the FDA are the safety and effectiveness of healthcare products, which have to be assured through a quality system, as specified under the 21 CFR 820 regulation. In addition, because biomedical engineers often develop devices and technologies for "consumer" use, such as physical therapy devices (which are also "medical" devices), these may also be governed in some respects by the Consumer Product Safety Commission. The greatest hurdles tend to be 510(k) "clearance" (typically for Class II devices) or pre-market "approval" (typically for drugs and Class III devices). In the European context, safety, effectiveness, and quality are ensured through the "Conformity Assessment", which is defined as "the method by which a manufacturer demonstrates that its device complies with the requirements of the European Medical Device Directive".
The directive specifies different procedures according to the class of the device, ranging from a simple Declaration of Conformity (Annex VII) for Class I devices to EC verification (Annex IV), production quality assurance (Annex V), product quality assurance (Annex VI), and full quality assurance (Annex II). The Medical Device Directive specifies detailed procedures for certification. In general terms, these procedures include tests and verifications that are to be contained in specific deliverables such as the risk management file, the technical file, and the quality system deliverables. The risk management file is the first deliverable and conditions the subsequent design and manufacturing steps. The risk management stage shall drive the product so that product risks are reduced to a level acceptable with respect to the benefits expected for the patients from the use of the device. The technical file contains all the documentation data and records supporting medical device certification. The FDA technical file has similar content, although organized in a different structure. The quality system deliverables usually include procedures that ensure quality throughout the whole product life cycle. The same standard (EN ISO 13485) is usually applied for quality management systems in the US and worldwide. In the European Union, there are certifying entities named "Notified Bodies", accredited by the European Member States. The Notified Bodies must ensure the effectiveness of the certification process for all medical devices apart from Class I devices, for which a declaration of conformity produced by the manufacturer is sufficient for marketing. Once a product has passed all the steps required by the Medical Device Directive, the device is entitled to bear a CE marking, indicating that the device is believed to be safe and effective when used as intended, and, therefore, it can be marketed within the European Union area. The different regulatory arrangements sometimes result in particular technologies being developed first for either the U.S. or Europe, depending on the more favorable form of regulation. While nations often strive for substantive harmony to facilitate cross-national distribution, philosophical differences about the "optimal extent" of regulation can be a hindrance; more restrictive regulations seem appealing on an intuitive level, but critics decry the tradeoff cost in terms of slowing access to life-saving developments. RoHS II. Directive 2011/65/EU, better known as RoHS 2, is a recast of legislation originally introduced in 2002. The original EU legislation, the "Restrictions of Certain Hazardous Substances in Electrical and Electronics Devices" (RoHS Directive 2002/95/EC), was replaced and superseded by 2011/65/EU, published in July 2011 and commonly known as RoHS 2. RoHS seeks to limit the dangerous substances in circulation in electronics products, in particular toxins and heavy metals, which are subsequently released into the environment when such devices are recycled. The scope of RoHS 2 is widened to include products previously excluded, such as medical devices and industrial equipment. In addition, manufacturers are now obliged to provide conformity risk assessments and test reports, or explain why they are lacking. For the first time, not only manufacturers but also importers and distributors share a responsibility to ensure that electrical and electronic equipment within the scope of RoHS complies with the hazardous substance limits and bears a CE mark. IEC 60601.
The new international standard IEC 60601 for home healthcare electro-medical devices defines the requirements for devices used in the home healthcare environment. IEC 60601-1-11 (2010) must now be incorporated into the design and verification of a wide range of home-use and point-of-care medical devices, along with other applicable standards in the IEC 60601 3rd edition series. The mandatory date for implementation of the EN European version of the standard is June 1, 2013. The US FDA requires the use of the standard from June 30, 2013, while Health Canada recently extended the required date from June 2012 to April 2013. The North American agencies will only require these standards for new device submissions, while the EU will take the more severe approach of requiring all applicable devices being placed on the market to consider the home healthcare standard. AS/NZS 3551:2012. AS/NZS 3551:2012 is the Australian and New Zealand standard for the management of medical devices. The standard specifies the procedures required to maintain a wide range of medical assets in a clinical setting (e.g., a hospital). The standard is based on the IEC 60601 standards. It covers a wide range of medical equipment management elements, including procurement, acceptance testing, maintenance (electrical safety and preventive maintenance testing), and decommissioning. Training and certification. Education. Biomedical engineers require considerable knowledge of both engineering and biology, and typically have a bachelor's (B.Sc., B.S., B.Eng. or B.S.E.), master's (M.S., M.Sc., M.S.E., or M.Eng.), or doctoral (Ph.D. or MD-PhD) degree in BME (biomedical engineering) or another branch of engineering with considerable potential for BME overlap. As interest in BME increases, many engineering colleges now have a biomedical engineering department or program, with offerings ranging from the undergraduate (B.Sc., B.S., B.Eng. or B.S.E.) to doctoral levels. Biomedical engineering has only recently been emerging as "its own discipline" rather than a cross-disciplinary hybrid specialization of other disciplines, and BME programs at all levels are becoming more widespread, including the Bachelor of Science in Biomedical Engineering, which includes enough biological science content that many students use it as a "pre-med" major in preparation for medical school. The number of biomedical engineers is expected to rise as both a cause and an effect of improvements in medical technology. In the U.S., an increasing number of undergraduate programs are also becoming recognized by ABET as accredited bioengineering/biomedical engineering programs. As of 2023, 155 programs were accredited by ABET. In Canada and Australia, accredited graduate programs in biomedical engineering are common. For example, McMaster University offers an M.A.Sc., an MD/PhD, and a PhD in biomedical engineering. The first Canadian undergraduate BME program was offered at the University of Guelph as a four-year B.Eng. program. The Polytechnique in Montreal also offers a bachelor's degree in biomedical engineering, as does Flinders University. As with many degrees, the reputation and ranking of a program may factor into the desirability of a degree holder for either employment or graduate admission. The reputation of many undergraduate degrees is also linked to the institution's graduate or research programs, which have some tangible factors for rating, such as research funding and volume, publications, and citations.
With BME specifically, the ranking of a university's hospital and medical school can also be a significant factor in the perceived prestige of its BME department/program. Graduate education is a particularly important aspect of BME. While many engineering fields (such as mechanical or electrical engineering) do not require graduate-level training to obtain an entry-level job in the field, the majority of BME positions prefer or even require it. Since most BME-related professions involve scientific research, such as pharmaceutical and medical device development, graduate education is almost a requirement (as undergraduate degrees typically do not provide sufficient research training and experience). This can be either a master's or a doctoral degree; while in certain specialties a Ph.D. is notably more common than in others, it is hardly ever the majority (except in academia). In fact, the perceived need for some kind of graduate credential is so strong that some undergraduate BME programs will actively discourage students from majoring in BME without an expressed intention to also obtain a master's degree or apply to medical school afterwards. Graduate programs in BME, as in other scientific fields, are highly varied, and particular programs may emphasize certain aspects within the field. They may also feature extensive collaborative efforts with programs in other fields (such as the university's medical school or other engineering divisions), owing again to the interdisciplinary nature of BME. M.S. and Ph.D. programs will typically require applicants to have an undergraduate degree in BME, or in another engineering discipline (plus certain life science coursework), or in a life science (plus certain engineering coursework). Education in BME also varies greatly around the world. By virtue of its extensive biotechnology sector, its numerous major universities, and relatively few internal barriers, the U.S. has progressed a great deal in its development of BME education and training opportunities. Europe, which also has a large biotechnology sector and an impressive education system, has encountered trouble in creating uniform standards as the European community attempts to supplant some of the national jurisdictional barriers that still exist. Recently, initiatives such as BIOMEDEA have sprung up to develop BME-related education and professional standards. Other countries, such as Australia, are recognizing and moving to correct deficiencies in their BME education. Also, as high-technology endeavors are usually marks of developed nations, some areas of the world are prone to slower development in education, including in BME. Licensure/certification. As with other learned professions, each state has certain (fairly similar) requirements for becoming licensed as a registered Professional Engineer (PE), but in the US such a license is not required to work as an engineer in industry in the majority of situations (due to an exception known as the industrial exemption, which effectively applies to the vast majority of American engineers). The US model has generally been to require licensure only of practicing engineers who offer engineering services that impact public welfare, safety, the safeguarding of life, health, or property, while engineers working in private industry without a direct offering of engineering services to the public or other businesses, education, and government need not be licensed.
This is notably not the case in many other countries, where a license is as legally necessary to practice engineering as it is for law or medicine. Biomedical engineering is regulated in some countries, such as Australia, but registration is typically only recommended and not required. In the UK, mechanical engineers working in the areas of medical engineering, bioengineering, or biomedical engineering can gain Chartered Engineer status through the Institution of Mechanical Engineers. The Institution also runs the Engineering in Medicine and Health Division. The Institute of Physics and Engineering in Medicine (IPEM) has a panel for the accreditation of MSc courses in biomedical engineering, and Chartered Engineer status can also be sought through IPEM. The Fundamentals of Engineering exam, the first (and more general) of the two licensure examinations in most U.S. jurisdictions, does now cover biology (although technically not BME). For the second exam, called the Principles and Practices, Part 2, or the Professional Engineering exam, candidates may select a particular engineering discipline's content to be tested on; there is currently no option for BME, meaning that any biomedical engineers seeking a license must prepare to take this examination in another category (which does not affect the actual license, since most jurisdictions do not recognize discipline specialties anyway). However, the Biomedical Engineering Society (BMES) has been, as of 2009, exploring the possibility of seeking to implement a BME-specific version of this exam to facilitate biomedical engineers pursuing licensure. Beyond governmental registration, certain private-sector professional/industrial organizations also offer certifications with varying degrees of prominence. One such example is the Certified Clinical Engineer (CCE) certification for clinical engineers. Career prospects. In 2012, there were about 19,400 biomedical engineers employed in the US, and the field was predicted to grow by 5% (faster than average) from 2012 to 2022. Biomedical engineering has the highest percentage of female engineers compared to other common engineering professions. As of 2023, there were about 19,700 jobs in the field; the average pay was around $100,730 per year, or about $48.43 an hour. Employment is also expected to grow by 7% from 2023 to 2033, even faster than the earlier projection.
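As a minimal, self-contained illustration of the kind of sequence comparison mentioned in the Bioinformatics section above, the following Python sketch reports positions where two pre-aligned DNA sequences differ, i.e., candidate SNPs. The toy sequences and the assumption of equal-length, already-aligned input are simplifications made here for illustration; real SNP-calling pipelines operate on sequencing reads and involve alignment and statistical filtering.

# Hypothetical example: report mismatch positions (candidate SNPs)
# between two pre-aligned DNA sequences of equal length.
def candidate_snps(reference: str, sample: str):
    """Return (position, reference_base, sample_base) for each mismatch."""
    if len(reference) != len(sample):
        raise ValueError("sequences must be aligned to equal length")
    return [(i, r, s)
            for i, (r, s) in enumerate(zip(reference, sample))
            if r != s]

reference = "ATGGCGTACGTTAGC"   # illustrative toy data, not real genomic sequence
observed  = "ATGGCGTACATTAGC"
for pos, ref_base, obs_base in candidate_snps(reference, observed):
    print(f"position {pos}: {ref_base} -> {obs_base}")   # prints: position 9: G -> A

The sketch captures only the core operation, a position-wise comparison; everything surrounding it in a real pipeline (read alignment, quality scores, population statistics) is deliberately omitted.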
4829
7903804
https://en.wikipedia.org/wiki?curid=4829
Balkans
The Balkans ( , ), corresponding partially with the Balkan Peninsula, is a geographical area in southeastern Europe with various geographical and historical definitions. The region takes its name from the Balkan Mountains, which stretch throughout the whole of Bulgaria. The Balkan Peninsula is bordered by the Adriatic Sea in the northwest, the Ionian Sea in the southwest, the Aegean Sea in the south, the Turkish straits in the east, and the Black Sea in the northeast. The northern border of the peninsula is variously defined. The highest point of the Balkans is Musala, 2,925 m, in the Rila mountain range, Bulgaria. The concept of the Balkan Peninsula was created by the German geographer August Zeune in 1808, who mistakenly considered the Balkan Mountains the dominant mountain system of southeastern Europe, spanning from the Adriatic Sea to the Black Sea. In the 19th century the term "Balkan Peninsula" was a synonym for Rumelia, the parts of Europe that were provinces of the Ottoman Empire at the time. It had a geopolitical rather than a geographical definition, which was further promoted during the creation of Yugoslavia in the early 20th century. The definition of the Balkan Peninsula's natural borders does not coincide with the technical definition of a peninsula; hence modern geographers reject the idea of a Balkan Peninsula, while historical scholars usually discuss the Balkans as a region. The term has acquired a stigmatized and pejorative meaning related to the process of Balkanization. The region may alternatively be referred to as Southeast Europe. The borders of the Balkans are, due to many contrasting definitions, widely disputed, with no universal agreement on its components. By most definitions, the term fully encompasses Albania, Bosnia and Herzegovina, Bulgaria, Croatia (up to the Sava and Kupa rivers), mainland Greece, Kosovo, Montenegro, North Macedonia, Northern Dobruja in Romania, Serbia (up to the Danube river), and East Thrace in Turkey. However, many definitions also include the remaining territories of Croatia, Romania, and Serbia, as well as Slovenia (up to the Kupa river). Additionally, some definitions include Hungary and Moldova due to cultural and historical factors. The province of Trieste in northeastern Italy, whilst by some definitions on the geographical peninsula, is generally excluded from the Balkans in a regional context. Name. Etymology. The origin of the word "Balkan" is obscure; it may be related to Turkish 'mud' (from Proto-Turkic *"bal" 'mud, clay; thick or gluey substance', cf. also Turkic 'honey'), and the Turkish suffix "-an" 'swampy forest' or Persian "bālā-khāna" 'big high house'. It was used mainly during the time of the Ottoman Empire. In both Ottoman Turkish and modern Turkish, "balkan" means 'chain of wooded mountains'. Historical names and meaning. From antiquity to the early Middle Ages. The region that is nowadays known as the Balkans was largely the home of the ancient Danube civilisation (Europe's oldest), also referred to as the Old Europe civilization, which peaked between 5000 and 3500 BC. From classical antiquity through the Middle Ages, the Balkan Mountains were called by the local Thracian name "Haemus". According to Greek mythology, the Thracian king Haemus was turned into a mountain by Zeus as a punishment, and the mountain has retained his name. A reverse name scheme has also been suggested. D. Dechev considers that Haemus (Αἷμος) is derived from a Thracian word "*saimon", 'mountain ridge'.
A third possibility is that "Haemus" (Αἷμος) derives from the Greek word "haima" (αἷμα) meaning 'blood'. The myth relates to a fight between Zeus and the monster/titan Typhon. Zeus injured Typhon with a thunderbolt, and Typhon's blood fell on the mountains, giving them their name. Late Middle Ages and Ottoman period. The earliest mention of the name appears in an early 14th-century Arab map, in which the Haemus Mountains are referred to as "Balkan". The first attested use of the name "Balkan" in the West for the mountain range in Bulgaria was in a letter sent in 1490 to Pope Innocent VIII by Filippo Buonaccorsi, an Italian humanist, writer, and diplomat. The Ottomans first mention it in a document dated 1565. There has been no other documented usage of the word to refer to the region before that, although other Turkic tribes had already settled in or were passing through the region. There is also a claim, popular in Bulgaria, of an earlier Bulgar Turkic origin of the word, but it is only an unscholarly assertion. The word was used by the Ottomans in Rumelia in its general meaning of mountain, as in "Kod̲j̲a-Balkan", "Čatal-Balkan", and "Ungurus-Balkani̊", but it was especially applied to the Haemus mountain. The name is still preserved in Central Asia with the Balkan Daglary (Balkan Mountains) and the Balkan Region of Turkmenistan. The English traveler John Bacon Sawrey Morritt introduced this term into English literature at the end of the 18th century, and other authors started applying the name to the wider area between the Adriatic and the Black Sea. The concept of the "Balkans" was created by the German geographer August Zeune in 1808, who mistakenly considered it the dominant central mountain system of Southeast Europe, spanning from the Adriatic Sea to the Black Sea. During the 1820s, "Balkan became the preferred although not yet exclusive term alongside Haemus among British travelers... Among Russian travelers not so burdened by classical toponymy, Balkan was the preferred term". In European books printed until the late 1800s, the region was also known as the Illyrian Peninsula (in German, Illyrische Halbinsel). Evolution of meaning in the 19th and 20th centuries. The term was not commonly used in geographical literature until the mid-19th century because, even then, scientists like Carl Ritter warned that only the part south of the Balkan Mountains could be considered a peninsula, and proposed renaming it the "Greek peninsula". Other prominent geographers who did not agree with Zeune were Hermann Wagner, Theobald Fischer, Marion Newbigin, and Albrecht Penck, while the Austrian diplomat Johann Georg von Hahn, in 1869, used the term "Südosteuropäische Halbinsel" ('Southeastern European peninsula') for the same territory. Another reason the term was not commonly accepted was that the then European Turkey had a similar land extent. However, after the Congress of Berlin (1878) there was a political need for a new term, and gradually "the Balkans" was revitalized, though in many maps the northern border lay in Serbia and Montenegro and Greece was not included (such maps depicted only the then Ottoman-occupied parts of Europe), while Yugoslavian maps also included Croatia and Bosnia. At the time, the "Balkan Peninsula" was also understood as a synonym for Rumelia or "European Turkey", and, in its broadest sense, encompassed the borders of all former Ottoman provinces in Europe.
The usage of the term changed at the very end of the 19th and the beginning of the 20th century, when it was embraced by Serbian geographers, most prominently by Jovan Cvijić. This was done with political reasoning, as an affirmation of Serbian nationalism across the whole territory of the South Slavs, and it was accompanied by anthropological and ethnological studies of the South Slavs through which various nationalistic and racialist theories were claimed. Through such policies and Yugoslavian maps the term was elevated to the modern status of a geographical region. The term acquired political nationalistic connotations far from its initial geographic meaning, arising from political changes from the late 19th century to the creation of post–World War I Yugoslavia (initially the Kingdom of Serbs, Croats and Slovenes in 1918). After the dissolution of Yugoslavia beginning in June 1991, the term "Balkans" acquired a negative political meaning, especially in Croatia and Slovenia, as well as in worldwide casual usage for war conflicts and fragmentation of territory. Southeast Europe. In part due to the historical and political connotations of the term "Balkans", especially since the military conflicts of the 1990s in Yugoslavia in the western half of the region, the term "Southeast Europe" is becoming increasingly popular. A European Union (EU) initiative of 1999 is called the "Stability Pact for Southeastern Europe", and the online newspaper "Balkan Times" renamed itself "Southeast European Times" in 2003. Current. In the other languages of the Balkans, the region or peninsula is known under cognate names. Definitions and boundaries. Balkan Peninsula. The Balkan Peninsula is bounded by the Adriatic Sea to the west, the Mediterranean Sea (including the Ionian and Aegean seas) and the Sea of Marmara to the south, and the Black Sea to the east. Its northern boundary is subject to varying interpretations but is often given as the Danube, Sava and Kupa rivers. The Balkan Peninsula has a combined area of about 470,000 km². The peninsula is generally encompassed in the region known as Southeast Europe. Italy currently holds a small area around Trieste that by some older definitions is considered part of the Balkan Peninsula. However, the regions of Trieste and Istria are not usually considered part of the peninsula by Italian geographers, due to their definition limiting its western border to the Kupa River. Balkans. The borders of the Balkans region are, due to a multitude of contrasting definitions, widely disputed, with no universal agreement on its components. By most definitions, it fully encompasses Albania, Bosnia and Herzegovina, Bulgaria, Croatia (up to the Sava and Kupa rivers), mainland Greece, Kosovo, Montenegro, North Macedonia, Northern Dobruja in Romania, Serbia (up to the Danube river) and East Thrace in Turkey. However, many definitions also include the remaining territories of Croatia, Romania and Serbia, as well as Slovenia (up to the Kupa river). Additionally, some definitions include Hungary and Moldova due to cultural and historical factors. The Province of Trieste in northeastern Italy, whilst by some definitions on the geographical peninsula, is generally excluded from the Balkans in a regional context. The term Southeast Europe may also be applied to the region, with various interpretations, although Balkan countries may alternatively be placed in Southern, Central or Eastern Europe. Turkey, including East Thrace, is generally placed in West Asia or the Middle East. Western Balkans.
The "Western Balkans" is a political neologism coined to refer to Albania and the territory of the former Yugoslavia, except Slovenia, since the early 1990s. The region of the Western Balkans, a coinage exclusively used in pan-European parlance, roughly corresponds to the Dinaric Alps territory. The institutions of the EU have generally used the term "Western Balkans" to mean the Balkan area that includes countries that are not members of the EU, while others refer to the geographical aspects. Each of these countries aims to be part of the future enlargement of the EU and reach democracy and transmission scores but, until then, they will be strongly connected with the pre-EU waiting programme Central European Free Trade Agreement. Croatia, considered part of the Western Balkans, joined the EU in July 2013. Criticism as geographical definition. The term is scrutinised for having a geopolitical, rather than a geographical meaning and definition, as a multiethnic and political area in the southeastern part of Europe. The geographical term of a peninsula defines that the sea border must be longer than the land border, with the land side being the shortest in the triangle, but that is not the case for the Balkan Peninsula. Both the eastern and western sea catheti from Odesa to Cape Matapan (–1350 km) and from Trieste to Cape Matapan (–1285 km) are shorter than the land cathetus from Trieste to Odesa (–1365 km). The land has too long a land border to qualify as a peninsula – Szczecin (920 km) and Rostock (950 km) at the Baltic Sea are closer to Trieste than Odesa yet it is not considered as another European peninsula. Since the late 19th and early 20th century no exact northern border has been clear, with an issue, whether the rivers are usable for its definition. In studies the Balkans' natural borders, especially the northern border, are often avoided to be addressed, considered as a "problème fastidieux" (delicate problem) by André Blanc in "Géographie des Balkans" (1965), while John Lampe and Marvin Jackman in "Balkan Economic History" (1971) noted that "modern geographers seem agreed in rejecting the old idea of a Balkan Peninsula". Another issue is the name: the Balkan Mountains, mostly in Northern Bulgaria, do not dominate the region by length and area as do the Dinaric Alps. An eventual Balkan peninsula can be considered a territory south of the Balkan Mountains, with a possible name "Greek-Albanian Peninsula". The term influenced the meaning of Southeast Europe which again is not properly defined by geographical factors. Croatian geographers and academics are highly critical of inclusion of Croatia within the broad geographical, social-political and historical context of the Balkans, while the neologism Western Balkans is perceived as a humiliation of Croatia by the European political powers. According to M. S. Altić, the term has two different meanings, "geographical, ultimately undefined, and cultural, extremely negative, and recently strongly motivated by the contemporary political context". In 2018, President of Croatia Kolinda Grabar-Kitarović stated that the use of the term "Western Balkans" should be avoided because it does not imply only a geographic area, but also negative connotations, and instead must be perceived as and called Southeast Europe because it is part of Europe. Slovenian philosopher Slavoj Žižek said of the definition, Nature and natural resources. Most of the area is covered by mountain ranges running from the northwest to southeast. 
The main ranges are the Balkan Mountains (Stara Planina in Bulgarian), running from the Black Sea coast in Bulgaria to the border with Serbia; the Rila-Rhodope massif in southern Bulgaria; the Dinaric Alps in Bosnia and Herzegovina, Croatia and Montenegro; the Korab-Šar mountains, which spread from Kosovo to Albania and North Macedonia; the Pindus range, spanning from southern Albania into central Greece; and the Albanian Alps and the Alps at the northwestern border. The highest mountain of the region is Rila in Bulgaria, with Musala at 2,925 m; the second is Mount Olympus in Greece, with Mytikas at 2,917 m; and the third is Pirin, also in Bulgaria, with Vihren at 2,915 m. The karst field or polje is a common feature of the landscape. On the Adriatic and Aegean coasts the climate is Mediterranean, on the Black Sea coast it is humid subtropical and oceanic, and inland it is humid continental. In the northern part of the peninsula and on the mountains, winters are frosty and snowy, while summers are hot and dry. In the southern part, winters are milder. The humid continental climate is predominant in Bosnia and Herzegovina, northern Croatia, Bulgaria, Kosovo, northern Montenegro, the Republic of North Macedonia, and the interiors of Albania and Serbia, while the other, less common climates, the humid subtropical and oceanic climates, are seen on the Black Sea coast of Bulgaria and in Balkan Turkey (European Turkey). The Mediterranean climate is seen on the Adriatic coasts of Albania, Croatia and Montenegro, the Ionian coasts of Albania and Greece, and the Aegean coasts of Greece and Balkan Turkey. Over the centuries, forests have been cut down and replaced with bush. In the southern part and on the coast there is evergreen vegetation. Inland there are woods typical of Central Europe (oak and beech, and in the mountains, spruce, fir and pine). The tree line in the mountains lies at a height of 1,800–2,300 m. The land provides habitats for numerous endemic species, including extraordinarily abundant insects and reptiles that serve as food for a variety of birds of prey and rare vultures. The soils are generally poor, except on the plains, where areas with natural grass, fertile soils and warm summers provide an opportunity for tillage. Elsewhere, land cultivation is mostly unsuccessful because of the mountains, hot summers and poor soils, although certain crops such as olives and grapes flourish. Energy resources are scarce, except in Kosovo, where considerable coal, lead, zinc, chromium and silver deposits are located. Other deposits of coal, especially in Bulgaria, Serbia and Bosnia, also exist. Lignite deposits are widespread in Greece. Scarce petroleum reserves exist in Greece, Serbia and Albania. Natural gas deposits are likewise scarce. Hydropower is in wide use, from over 1,000 dams. The often relentless bora wind is also being harnessed for power generation. Metal ores are more common than other raw materials. Iron ore is rare, but in some countries there are considerable amounts of copper, zinc, tin, chromite, manganese, magnesite and bauxite. Some metals are exported. History and geopolitical significance. Antiquity. The Balkan region was the first area in Europe to experience the arrival of farming cultures in the Neolithic era. The Balkans have been inhabited since the Paleolithic and are the route by which farming from the Middle East spread to Europe during the Neolithic (7th millennium BC).
The first known Neolithic culture of Old Europe was the Kakanj culture, which appeared around the town of Kakanj in central Bosnia and covered the period from about 6795 to 4900 BC. The practices of growing grain and raising livestock arrived in the Balkans from the Fertile Crescent by way of Anatolia and spread west and north into Central Europe, particularly through Pannonia. Two early culture-complexes developed in the region, the Starčevo culture and the Vinča culture. The Balkans are also the location of the first advanced civilizations. The Vinča culture developed a form of proto-writing before the Sumerians and Minoans, known as the Old European script; the bulk of the symbols were created in the period between 4500 and 4000 BC, with the ones on the Tărtăria clay tablets even dating back to around 5300 BC. The identity of the Balkans is dominated by its geographical position; historically the area was known as a crossroads of cultures. It has been a juncture between the Latin and Greek bodies of the Roman Empire, the destination of a massive influx of pagan Bulgars and Slavs, an area where Orthodox and Catholic Christianity met, as well as the meeting point between Islam and Christianity. Albanic, Hellenic, and other Palaeo-Balkan languages had their formative core in the Balkans after the Indo-European migrations in the region. In pre-classical and classical antiquity, this region was home to Greeks, Illyrians, Paeonians, Thracians, Dacians, and other ancient groups. The Achaemenid Persian Empire incorporated parts of the Balkans, comprising Macedonia, Thrace (parts of present-day eastern Bulgaria), and the Black Sea coastal region of Romania, beginning in 512 BC. Following the Persian defeat in the Greco-Persian Wars in 479 BC, the Persians abandoned all of their European territories, which regained their independence. During the reign of Philip II of Macedon (359–336 BC), Macedonia rose to become the most powerful state in the Balkans. In the second century BC, Rome conquered the region and spread Roman culture and the Latin language, but significant parts still remained under classical Greek influence. The only Paleo-Balkan languages that survived are Albanian and Greek. The Romans considered the Rhodope Mountains to be the northern limit of the Peninsula of Haemus, and approximately the same limit applied to the border between Greek and Latin use in the region (later called the Jireček Line). However, large areas south of the Jireček Line were and are inhabited by Vlachs (Aromanians), the Romance-speaking heirs of the Roman Empire. The Bulgars and Slavs arrived in the sixth century and began assimilating and displacing the older inhabitants, who had themselves already been assimilated through Romanization and Hellenization. This migration brought about the formation of distinct ethnic groups amongst the South Slavs, which included the Bulgarians, Croats, Serbs and Slovenes. Prior to the Slavic arrival, parts of the western peninsula, including cities like Nish, Shtip, and Shkup, had been home to the Proto-Albanians. This is indicated by the development of the names: for example, "Naissos" > "Nish" and "Astibos" > "Shtip" follow Albanian phonetic sound rules and have entered Slavic, indicating that Proto-Albanian was spoken in those cities prior to the Slavic invasion of the Balkans.
Proto-Albanian speakers were Christianized under the Latin sphere of influence, specifically in the 4th century CE, as shown by the basic Christian terms in Albanian, which are of Latin origin and entered Proto-Albanian before the Gheg–Tosk dialectal diversification. Middle Ages and Early modern period. During the Early Middle Ages, the Byzantine Empire was the dominant state in the region, both militarily and culturally. Its cultural strength became particularly evident in the second half of the 9th century, when the Byzantine missionaries Cyril and Methodius managed to spread the Byzantine variant of Christianity to the majority of the Balkans' inhabitants, who had been pagan beforehand. Initially it was adopted by the Bulgarians and Serbs, with the Romanians joining somewhat later. The lack of Old Church Slavonic terms in Albanian Christian terminology shows that the missionary activities during the Christianization of the Slavs did not involve Albanian speakers; indeed, Christian belief among the Albanians had survived through the centuries and had already become an important cultural element of their ethnic identity. The emergence of the First Bulgarian Empire and the constant conflicts between it and the Byzantine Empire significantly weakened Byzantine control over the Balkans by the end of the 10th century. The Byzantines further lost power in the Balkans after the resurgence of the Bulgarians in the late 12th century, with the forming of their Second Bulgarian Empire. After the collapse of the Second Bulgarian Empire, the Byzantine Empire's grip on power was prolonged by the inability of the Slavs to unite, which was caused by frequent infighting amongst themselves. In the first half of the 14th century Bulgaria was overshadowed by the new rising regional power of Serbia, as Stefan Dušan conquered much of the Balkans and created the Serbian Empire. The Serbian and Byzantine empires continued to be the dominant forces in the region until the arrival of the Ottomans several decades later. Ottoman expansion in the region began in the second half of the 14th century, as the Byzantine Empire continued to lose its grip on the region after several defeats to the Ottomans. In 1362, the Ottoman Turks conquered Adrianople (now Edirne, Turkey). This was the start of their conquest of the Balkan Peninsula, which lasted for more than a century. Other states in the area then started falling: Bulgaria in 1396, the Byzantine Empire in 1453, Serbia after the Siege of Smederevo in 1459, Bosnia in 1463, Herzegovina in 1482, and Montenegro in 1496. The conquest was made easier for the Ottomans by existing divisions among the Orthodox peoples and by the even deeper rift that existed at the time between the Eastern and Western Christians of Europe. The Albanians under Skanderbeg's leadership resisted the Ottomans for a time (1443–1468) by using guerrilla warfare. Skanderbeg's achievements, in particular the Battle of Albulena and the First Siege of Krujë, won him fame across Europe. The Ottomans eventually conquered nearly the entirety of the Balkans and reached central Europe by the early 16th century. Some smaller countries, such as Montenegro, managed to retain some autonomy by managing their own internal affairs, since their territory was too mountainous to subdue completely. Another small country that retained its independence, both de facto and de jure in this case, was the Adriatic trading hub of Ragusa (now Dubrovnik, Croatia).
By the end of the 16th century, the Ottoman Empire had become the controlling force in the region after expanding from Anatolia through Thrace to the Balkans. Many people in the Balkans place their greatest folk heroes in the era of either the onslaught or the retreat of the Ottoman Empire: for Greeks, Constantine XI Palaiologos and Kolokotronis; for Serbs, Miloš Obilić, Tsar Lazar and Karađorđe; for Albanians, George Kastrioti Skanderbeg; for ethnic Macedonians, Nikola Karev and Goce Delčev; for Bulgarians, Vasil Levski, Georgi Sava Rakovski and Hristo Botev; and for Croats, Nikola Šubić Zrinjski. In the past several centuries, because of the frequent Ottoman wars in Europe fought in and around the Balkans and the comparative Ottoman isolation from the mainstream of economic advance (reflecting the shift of Europe's commercial and political centre of gravity towards the Atlantic), the Balkans have been the least developed part of Europe. According to Halil İnalcık, "The population of the Balkans, according to one estimate, fell from a high of 8 million in the late 16th century to only 3 million by the mid-eighteenth. This estimate is based on Ottoman documentary evidence". Most of the Balkan nation-states emerged during the 19th and early 20th centuries as they gained independence from the Ottoman or Habsburg empires: Greece in 1821, Serbia and Montenegro in 1878, Romania in 1881, Bulgaria in 1908 and Albania in 1912. Recent history. World wars. In 1912–1913, the First Balkan War broke out when the nation-states of Bulgaria, Serbia, Greece and Montenegro united in an alliance against the Ottoman Empire. As a result of the war, almost all remaining European territories of the Ottoman Empire were captured and partitioned among the allies. Ensuing events also led to the creation of an independent Albanian state. Bulgaria insisted on its status quo territorial integrity, which the Great Powers had divided along other boundaries after the Russo-Turkish War (1877–78), and on the pre-war Bulgarian-Serbian agreement. Bulgaria was provoked by the backstage deals between its former allies, Serbia and Greece, on the allocation of the spoils at the end of the First Balkan War. At the time, Bulgaria was fighting at the main Thracian front. Bulgaria marked the beginning of the Second Balkan War when it attacked them. The Serbs and the Greeks repulsed the initial attacks, but when the Greek army invaded Bulgaria, together with an unprovoked Romanian intervention in Bulgaria's rear, Bulgaria collapsed. The Ottoman Empire used the opportunity to recapture Eastern Thrace, establishing its new western borders, which still stand today as part of modern Turkey. World War I was sparked in the Balkans in 1914 when members of Young Bosnia, a revolutionary organization with predominantly Serb and pro-Yugoslav members, assassinated the Austro-Hungarian heir Archduke Franz Ferdinand of Austria in Bosnia and Herzegovina's capital, Sarajevo. That caused a war between Austria-Hungary and Serbia which, through the existing chains of alliances, led to World War I. The Ottoman Empire soon joined the Central Powers, becoming one of the three empires participating in that alliance. The next year Bulgaria joined the Central Powers and attacked Serbia, which had been successfully fighting Austria-Hungary to the north for a year. That led to Serbia's defeat and the intervention of the Entente in the Balkans, which sent an expeditionary force to establish a new front, the third one of that war, which soon also became static.
The participation of Greece in the war three years later, in 1918, on the side of the Entente finally altered the balance between the opponents, leading to the collapse of the common German-Bulgarian front there, which caused the exit of Bulgaria from the war and, in turn, the end of World War I and the collapse of the Austro-Hungarian Empire. Between the two wars, in order to maintain the geopolitical status quo in the region after the end of World War I, the Balkan Pact, or Balkan Entente, was formed by a treaty signed between Greece, Romania, Turkey and Yugoslavia on 9 February 1934 in Athens. With the start of World War II, all Balkan countries, with the exception of Greece, were allies of Nazi Germany, having bilateral military agreements or being part of the Axis Pact. Fascist Italy expanded the war in the Balkans by using its protectorate Albania to invade Greece. After repelling the attack, the Greeks counterattacked, invading Italian-held Albania and causing Nazi Germany's intervention in the Balkans to help its ally. Days before the German invasion, a successful "coup d'état" in Belgrade by neutral military personnel seized power. Although the new government reaffirmed its intentions to fulfill its obligations as a member of the Axis, Germany, with Bulgaria, invaded both Greece and Yugoslavia. Yugoslavia immediately disintegrated when those loyal to the Serbian king and the Croatian units mutinied. Greece resisted, but after two months of fighting collapsed and was occupied. The two countries were partitioned between the three Axis allies, Bulgaria, Germany and Italy, and the Independent State of Croatia, a puppet state of Italy and Germany. During the occupation the population suffered considerable hardship due to repression and starvation, to which it reacted by creating a mass resistance movement. Together with the early and extremely heavy winter of that year (which caused hundreds of thousands of deaths among the poorly fed population), the German invasion had disastrous effects on the timetable of the planned invasion of Russia, causing a significant delay, which had major consequences during the course of the war. Finally, at the end of 1944, the Soviets entered Romania and Bulgaria, forcing the Germans out of the Balkans. They left behind a region largely ruined as a result of wartime exploitation. Cold War. During the Cold War, most of the countries in the Balkans were governed by communist governments. Greece became the first battleground of the emerging Cold War. The Truman Doctrine was the US response to the civil war, which raged from 1944 to 1949. This civil war, unleashed by the Communist Party of Greece, backed by communist volunteers from neighboring countries (Albania, Bulgaria and Yugoslavia), led to massive American assistance for the non-communist Greek government. With this backing, Greece managed to defeat the partisans and ultimately remained, together with Turkey, one of the only two non-communist countries in the region. However, despite being under communist governments, Yugoslavia (1948) and Albania (1961) fell out with the Soviet Union. Yugoslavia, led by Marshal Josip Broz Tito (1892–1980), first propped up and then rejected the idea of merging with Bulgaria, and instead sought closer relations with the West; it later even spearheaded, together with India and Egypt, the Non-Aligned Movement. Albania, on the other hand, gravitated toward Communist China, later adopting an isolationist position.
On 28 February 1953, Greece, Turkey and Yugoslavia signed the Agreement of Friendship and Cooperation in Ankara, forming the Balkan Pact of 1953. The treaty's aim was to deter Soviet expansion in the Balkans and eventually to create a joint military staff for the three countries. When the pact was signed, Turkey and Greece were members of NATO, while Yugoslavia was a non-aligned communist state; through the pact, Yugoslavia was able to associate itself indirectly with NATO. Though it was planned for the pact to remain in force for 20 years, it dissolved in 1960. As the only non-communist countries, Greece and Turkey were (and still are) part of NATO, composing the southeastern wing of the alliance. Post–Cold War. In the 1990s, the transition of the region's ex-Eastern bloc countries towards democratic free-market societies went peacefully. In non-aligned Yugoslavia, by contrast, wars between the former Yugoslav republics broke out after Slovenia and Croatia held free elections and their people voted for independence in their respective countries' referendums. Serbia in turn declared the dissolution of the union unconstitutional, and the Yugoslav People's Army unsuccessfully tried to maintain the status quo. Slovenia and Croatia declared independence on 25 June 1991, which prompted the Croatian War of Independence in Croatia and the Ten-Day War in Slovenia. The Yugoslav forces eventually withdrew from Slovenia in 1991, while the war in Croatia continued until late 1995. The two were followed by Macedonia and later Bosnia and Herzegovina, with Bosnia being the most affected by the fighting. The wars prompted the United Nations' intervention, and NATO ground and air forces took action against Serb forces in Bosnia and Herzegovina and FR Yugoslavia (i.e. Serbia and Montenegro). From the dissolution of Yugoslavia, six states achieved internationally recognized sovereignty: Slovenia, Croatia, Bosnia and Herzegovina, North Macedonia, Montenegro and Serbia; all of them are traditionally included in the Balkans, although this inclusion is often a controversial matter of dispute. In 2008, while under UN administration, Kosovo declared independence (according to official Serbian policy, Kosovo is still an internal autonomous region). In July 2010, the International Court of Justice ruled that the declaration of independence was legal. Most UN member states recognise Kosovo. After the end of the wars, a revolution broke out in Serbia, and Slobodan Milošević, the Serbian communist leader (elected president between 1989 and 2000), was overthrown and handed over for trial to the International Criminal Tribunal for violations of international humanitarian law during the Yugoslav wars. Milošević died of a heart attack in 2006 before a verdict could be delivered. In 2001 an Albanian uprising in Macedonia (North Macedonia) forced the country to give local autonomy to the ethnic Albanians in the areas where they predominate. With the dissolution of Yugoslavia, an issue emerged between the new country and Greece over the name under which the former (federated) republic of Macedonia would be internationally recognized. As the Macedonian part of Yugoslavia (see Vardar Macedonia), the federated republic had borne the name (Socialist) Republic of Macedonia under the Yugoslav identity, and it declared its sovereignty under this name in 1991. Greece, having a large homonymous region (see Macedonia), opposed the usage of the name as an indication of nationality and ethnicity.
The thus-dubbed Macedonia naming dispute was resolved under UN mediation when the Prespa agreement was reached in June 2018, under which the country was renamed North Macedonia in 2019. Balkan countries control the direct land routes between Western Europe and South-West Asia (Asia Minor and the Middle East). Since 2000, all Balkan countries have been friendly towards the EU and the US. Greece has been a member of the EU since 1981, Slovenia since 2004, Bulgaria and Romania since 2007, and Croatia since 2013. In 2005, the EU decided to start accession negotiations with candidate countries; Turkey and North Macedonia were accepted as candidates for EU membership. In 2012, Montenegro started accession negotiations with the EU. Since 2014, Albania has been an official candidate for accession to the EU. In 2015, Serbia was expected to start accession negotiations with the EU; however, this process has stalled over the recognition of Kosovo as an independent state by existing EU member states. Greece and Turkey have been NATO members since 1952. In March 2004, Bulgaria, Romania and Slovenia became members of NATO, and in April 2009 Albania and Croatia followed. Montenegro joined in June 2017. The most recent member state to be added to NATO was North Macedonia, on 27 March 2020. Almost all other countries have expressed a desire to join the EU, NATO, or both at some point in the future. Economy. Currently all of the states are republics, but until World War II all of the countries were monarchies. Most of the republics are parliamentary, excluding Romania and Bosnia, which are semi-presidential. All the states have open market economies, most of which are in the upper-middle-income range ($4,000–12,000 p.c.), except Croatia, Romania, Greece, and Slovenia, which have high-income economies (over $12,000 p.c.) and are classified with very high HDI, along with Bulgaria, in contrast to the remaining states, which are classified with high HDI. The states from the former Eastern Bloc that formerly had a planned economy system, as well as Turkey, mark gradual economic growth each year. The gross domestic product per capita is highest in Slovenia (over $29,000), followed by Croatia and Greece (~$20,000), Romania and Bulgaria (over $11,000), Turkey, Montenegro and Serbia (between $9,000 and $10,000), Bosnia and Herzegovina, Albania and North Macedonia (~$7,000), and Kosovo ($5,000). The Gini coefficient, which indicates the level of income inequality, places Albania, Bulgaria, and Serbia on the second level (the highest monetary equality), Greece, Montenegro and Romania on the third level, North Macedonia on the fourth, and Turkey on the fifth; the most unequal by Gini coefficient is Bosnia, at the eighth level, the penultimate level and one of the highest in the world. Unemployment is lowest in Romania and Bulgaria (around 5%), followed by Serbia and Albania (11–12%), Turkey, Greece, Bosnia and North Macedonia (13–16%), Montenegro (~18%), and Kosovo (~25%). As nations in the Western Balkans opened up to private investment in the 1990s, newly created enterprises (mostly SMEs) fueled regional economic development by facilitating the transition from a massive state-owned structure to a market economy. SMEs now account for 99% of all active businesses, up to 81% of total value created, and 72% of total employment in the Western Balkans.
The Western Balkans are mostly bank-based economies, with bank credit serving as the primary source of external capital for all enterprises, including SMEs. Despite this, the region's bank credit supply is limited and undeveloped. A recent analysis by the European Investment Bank estimated the funding deficit at US$2.8 billion, or around 2.5% of nominal GDP. In most Western Balkan markets, international banks have a market share of 70% to 90%. At the end of 2023, the macroeconomic environment in the Western Balkans indicated that risks were increasing, threatening to worsen the financial imbalance. Recent survey findings give conflicting data on enterprises' funding circumstances: while supply fell as a result of the COVID-19 pandemic and interest rate increases, it has shown a progressive recovery. Demographics. The region is inhabited by Albanians, Aromanians, Bulgarians, Bosniaks, Croats, Gorani, Greeks, Istro-Romanians, Macedonians, Hungarians, Megleno-Romanians, Montenegrins, Serbs, Slovenes, Romanians, Turks, and other ethnic groups that constitute minorities in certain countries, such as the Romani and Ashkali. Religion. The region is a meeting point of Orthodox Christianity, Islam and Roman Catholic Christianity. Eastern Orthodoxy is the majority religion in both the Balkan Peninsula and the Balkan region, and the Eastern Orthodox Church has played a prominent role in the history and culture of Eastern and Southeastern Europe. A variety of different traditions of each faith are practiced, with each of the Eastern Orthodox countries having its own national church. A part of the population in the Balkans defines itself as irreligious. Islam has a significant history in the region, where Muslims make up a large percentage of the population. A 2013 estimate placed the total Muslim population of the Balkans at around eight million. Islam is the largest religion in Albania, Bosnia and Herzegovina, and Kosovo, with significant minorities in Bulgaria, North Macedonia and Montenegro. Smaller populations of Muslims are also found in Romania, Serbia and Greece. The Jewish communities of the Balkans were some of the oldest in Europe and date back to ancient times. These communities were Sephardi Jews, except in Croatia and Slovenia, where the Jewish communities were mainly Ashkenazi Jews. In Bosnia and Herzegovina, the small and close-knit Jewish community is 90% Sephardic, and Ladino is still spoken among the elderly. The Sephardi Jewish cemetery in Sarajevo has tombstones of a unique shape, inscribed in ancient Ladino. Sephardi Jews used to have a large presence in the city of Thessaloniki; by 1900, some 80,000, or more than half of the population, were Jews. The Jewish communities in the Balkans suffered immensely during World War II, and the vast majority were killed during the Holocaust. An exception were the Bulgarian Jews, whom Boris III of Bulgaria sent to forced labor camps instead of Nazi concentration camps. Almost all of the few survivors emigrated to the then newly founded state of Israel and elsewhere. Almost no Balkan country today has a significant Jewish minority. Languages. The Balkan region today is a very diverse ethnolinguistic region, being home to multiple Slavic and Romance languages, as well as Albanian, Greek, Turkish, Hungarian and others. Romani is spoken by a large portion of the Romani people living throughout the Balkan countries.
Throughout history, many other ethnic groups with their own languages lived in the area, among them Thracians, Illyrians, Romans, Celts and various Germanic tribes. All of the aforementioned languages from the present and from the past belong to the wider Indo-European language family, with the exception of the Turkic languages (e.g., Turkish and Gagauz) and Hungarian. Urbanization. Most of the states in the Balkans are predominantly urbanized, with the lowest share of urban population as a percentage of the total found in Bosnia and Herzegovina at 49%, Kosovo at 50% and Slovenia at 55%. A list of largest cities: Only the European part of Istanbul is a part of the Balkans; it is home to two-thirds of the city's 15,519,267 inhabitants. Time zones. The time zones in the Balkans are defined as the following:
https://en.wikipedia.org/wiki?curid=4831
Bohr model
In atomic physics, the Bohr model or Rutherford–Bohr model was a model of the atom that incorporated some early quantum concepts. Developed from 1911 to 1918 by Niels Bohr and building on Ernest Rutherford's nuclear model, it supplanted the plum pudding model of J. J. Thomson only to be replaced by the quantum atomic model in the 1920s. It consists of a small, dense nucleus surrounded by orbiting electrons. It is analogous to the structure of the Solar System, but with attraction provided by electrostatic force rather than gravity, and with the electron energies quantized (assuming only discrete values). In the history of atomic physics, it followed, and ultimately replaced, several earlier models, including Joseph Larmor's Solar System model (1897), Jean Perrin's model (1901), the cubical model (1902), Hantaro Nagaoka's Saturnian model (1904), the plum pudding model (1904), Arthur Haas's quantum model (1910), the Rutherford model (1911), and John William Nicholson's nuclear quantum model (1912). The improvement over the 1911 Rutherford model mainly concerned the new quantum mechanical interpretation introduced by Haas and Nicholson, while forsaking any attempt to explain radiation according to classical physics. The model's key success lies in explaining the Rydberg formula for hydrogen's spectral emission lines. While the Rydberg formula had been known experimentally, it did not gain a theoretical basis until the Bohr model was introduced. Not only did the Bohr model explain the reasons for the structure of the Rydberg formula, it also provided a justification for the fundamental physical constants that make up the formula's empirical results. The Bohr model is a relatively primitive model of the hydrogen atom, compared to the valence shell model. As a theory, it can be derived as a first-order approximation of the hydrogen atom using the broader and much more accurate quantum mechanics, and thus may be considered an obsolete scientific theory. However, because of its simplicity and its correct results for selected systems (see below for application), the Bohr model is still commonly taught to introduce students to quantum mechanics or energy level diagrams before moving on to the more accurate, but more complex, valence shell atom. A related quantum model was proposed by Arthur Erich Haas in 1910 but was rejected until the 1911 Solvay Congress, where it was thoroughly discussed. The quantum theory of the period between Planck's discovery of the quantum (1900) and the advent of a mature quantum mechanics (1925) is often referred to as the "old quantum theory". Background. Until the second decade of the 20th century, atomic models were generally speculative. Even the concept of atoms, let alone atoms with internal structure, faced opposition from some scientists. Planetary models. In the late 1800s, speculations on the possible structure of the atom included planetary models with orbiting charged electrons. These models faced a significant constraint. In 1897, Joseph Larmor showed that an accelerating charge would radiate power according to classical electrodynamics, a result known as the Larmor formula. Since electrons forced to remain in orbit are continuously accelerating, they would be mechanically unstable. Larmor noted that the electromagnetic effects of multiple electrons, suitably arranged, would cancel each other. Thus subsequent atomic models based on classical electrodynamics needed to adopt such special multi-electron arrangements. Thomson's atom model.
When Bohr began his work on a new atomic theory in the summer of 1912, the atomic model proposed by J. J. Thomson, now known as the plum pudding model, was the best available. Thomson proposed a model with electrons rotating in coplanar rings within an atomic-sized, positively charged, spherical volume. Thomson showed by lengthy calculations that this model was mechanically stable, and it was electrodynamically stable under his original assumption of thousands of electrons per atom. Moreover, he suggested that the particularly stable configurations of electrons in rings were connected to the chemical properties of the atoms. He developed a formula for the scattering of beta particles that seemed to match experimental results. However, Thomson himself later showed that the atom had a factor of a thousand fewer electrons, challenging the stability argument and forcing the poorly understood positive sphere to carry most of the atom's mass. Thomson was also unable to explain the many lines in atomic spectra. Rutherford nuclear model. In 1908, Hans Geiger and Ernest Marsden demonstrated that alpha particles occasionally scatter at large angles, a result inconsistent with Thomson's model. In 1911 Ernest Rutherford developed a new scattering model, showing that the observed large-angle scattering could be explained by a compact, highly charged mass at the center of the atom. Rutherford scattering did not involve the electrons, and thus his model of the atom was incomplete. Bohr begins his first paper on his atomic model by describing Rutherford's atom as consisting of a small, dense, positively charged nucleus attracting negatively charged electrons. Atomic spectra. By the early twentieth century, it was expected that the atom would account for the many atomic spectral lines. These lines were summarized in empirical formulas by Johann Balmer and Johannes Rydberg. In 1897, Lord Rayleigh showed that vibrations of electrical systems predicted spectral lines that depend on the square of the vibrational frequency, contradicting the empirical formulas, which depended directly on the frequency. In 1907 Arthur W. Conway showed that, rather than the entire atom vibrating, vibrations of only one of the electrons in the system described by Thomson might be sufficient to account for spectral series. Although Bohr's model would also rely on just the electron to explain the spectrum, he did not assume an electrodynamical model for the atom. The other important advance in the understanding of atomic spectra was the Rydberg–Ritz combination principle, which related atomic spectral line frequencies to differences between 'terms', special frequencies characteristic of each element. Bohr would recognize the terms as energy levels of the atom divided by the Planck constant, leading to the modern view that the spectral lines result from energy differences. Haas atomic model. In 1910, Arthur Erich Haas proposed a model of the hydrogen atom with an electron circulating on the surface of a sphere of positive charge. The model resembled Thomson's plum pudding model, but Haas added a radical new twist: he constrained the electron's potential energy, $E_{\text{pot}}$, on a sphere of radius $a$ to equal the frequency, $f$, of the electron's orbit on the sphere times the Planck constant: $E_{\text{pot}} = \frac{e^2}{a} = h f$, where $e$ represents the charge on the electron and the sphere. Haas combined this constraint with the balance-of-forces equation. The attractive force between the electron and the sphere balances the centrifugal force: $\frac{e^2}{a^2} = 4\pi^2 m f^2 a$, where $m$ is the mass of the electron.
This combination relates the radius of the sphere to the Planck constant: $a = \frac{h^2}{4\pi^2 m e^2}$. Haas solved for the Planck constant using the then-current value for the radius of the hydrogen atom. Three years later, Bohr would use similar equations with a different interpretation: Bohr took the Planck constant as a given value and used the equations to predict $a_0$, the radius of the electron orbiting in the ground state of the hydrogen atom. This value is now called the Bohr radius. Influence of the Solvay Conference. The first Solvay Conference, in 1911, was one of the first international physics conferences. Nine Nobel or future Nobel laureates attended, including Ernest Rutherford, Bohr's mentor. Bohr did not attend, but he read the Solvay reports and discussed them with Rutherford. The subject of the conference was the theory of radiation and the energy quanta of Max Planck's oscillators. Planck's lecture at the conference ended with comments about atoms, and the discussion that followed it concerned atomic models. Hendrik Lorentz raised the question of the composition of the atom based on Haas's model, a form of Thomson's plum pudding model with a quantum modification. Lorentz explained that the size of atoms could be taken to determine the Planck constant, as Haas had done, or the Planck constant could be taken as determining the size of atoms. Bohr would adopt the second path. The discussions outlined the need for the quantum theory to be included in the atom. Planck explicitly mentioned the failings of classical mechanics. While Bohr had already expressed a similar opinion in his PhD thesis, at Solvay the leading scientists of the day discussed a break with classical theories. Bohr's first paper on his atomic model cites the Solvay proceedings, saying: "Whatever the alteration in the laws of motion of the electrons may be, it seems necessary to introduce in the laws in question a quantity foreign to the classical electrodynamics, "i.e." Planck's constant, or as it often is called the elementary quantum of action." Encouraged by the Solvay discussions, Bohr would assume the atom was stable and abandon the efforts to stabilize classical models of the atom. Nicholson atom theory. In 1911 John William Nicholson published a model of the atom which would influence Bohr's model. Nicholson developed his model based on the analysis of astrophysical spectroscopy. He connected the observed spectral line frequencies with the orbits of electrons in his atoms. The connection he adopted associated the atomic electron orbital angular momentum with the Planck constant. Whereas Planck focused on a quantum of energy, Nicholson's angular momentum quantum relates to orbital frequency. This new concept gave the Planck constant an atomic meaning for the first time. In his 1913 paper Bohr cites Nicholson as finding quantized angular momentum important for the atom. The other critical influence of Nicholson's work was his detailed analysis of spectra. Before Nicholson's work, Bohr thought the spectral data was not useful for understanding atoms. In comparing his work to Nicholson's, Bohr came to understand the spectral data and their value. When he then learned from a friend about Balmer's compact formula for the spectral line data, Bohr quickly realized his model would match it in detail. Nicholson's model was based on classical electrodynamics along the lines of J. J. Thomson's plum pudding model, but with the negative electrons orbiting a positive nucleus rather than circulating in a sphere.
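To see the two readings of these equations numerically, here is a minimal sketch in Python (not from Bohr or Haas; the constants are approximate CODATA values in SI form and the variable names are mine). It takes the Planck constant as given and solves for the ground-state radius, as Bohr did:

```python
import math

# Approximate CODATA constants, SI units
h = 6.62607015e-34        # Planck constant, J*s
m_e = 9.1093837015e-31    # electron mass, kg
e = 1.602176634e-19       # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m

# Bohr's reading of the Haas relation: take h as given and solve for
# the ground-state radius. In SI form, a0 = eps0 * h^2 / (pi * m_e * e^2).
a0 = eps0 * h**2 / (math.pi * m_e * e**2)
print(f"Bohr radius a0 = {a0:.4e} m")   # about 5.29e-11 m
```

Running this gives about 5.29 × 10⁻¹¹ m; reading the same relation in the other direction, as Haas did, one can instead solve for the Planck constant from a measured atomic radius.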
To avoid immediate collapse of this system he required that electrons come in pairs, so that the rotational acceleration of each electron was matched across the orbit. By 1913 Bohr had already shown, from the analysis of alpha particle energy loss, that hydrogen had only a single electron, not a matched pair. Bohr's atomic model would abandon classical electrodynamics. Nicholson's model of radiation was quantum, but it was attached to the orbits of the electrons. Bohr's quantization would associate it with differences in energy levels of his model of hydrogen rather than with the orbital frequency. Bohr's previous work. Bohr completed his PhD in 1911 with a thesis 'Studies on the Electron Theory of Metals', an application of the classical electron theory of Hendrik Lorentz. Bohr noted two deficits of the classical model. The first concerned the specific heat of metals, which James Clerk Maxwell noted in 1875: every additional degree of freedom in a theory of metals, like subatomic electrons, causes more disagreement with experiment. The second was that the classical theory could not explain magnetism. After his PhD, Bohr worked briefly in the lab of J. J. Thomson before moving to Rutherford's lab in Manchester to study radioactivity. He arrived just after Rutherford completed his proposal of a compact nuclear core for atoms. Charles Galton Darwin, also at Manchester, had just completed an analysis of alpha particle energy loss in metals, concluding that electron collisions were the dominant cause of loss. Bohr showed in a subsequent paper that Darwin's results would improve by accounting for electron binding energy. Importantly, this allowed Bohr to conclude that hydrogen atoms have a single electron. Development. Next, Bohr was told by his friend Hans Hansen that the Balmer series is calculated using the Balmer formula, an empirical equation discovered by Johann Balmer in 1885 that described the wavelengths of some spectral lines of hydrogen. This was further generalized by Johannes Rydberg in 1888, resulting in what is now known as the Rydberg formula. After this, Bohr declared, "everything became clear". In 1913 Niels Bohr put forth three postulates to provide an electron model consistent with Rutherford's nuclear model: electrons in an atom move in certain stable, stationary orbits without radiating energy; these stationary orbits are those in which the electron's angular momentum is an integer multiple of the reduced Planck constant; and an electron can only gain or lose energy by jumping between stationary orbits, absorbing or emitting radiation whose frequency is set by the energy difference of the levels according to the Planck relation $\Delta E = h\nu$. Bohr's condition, that the angular momentum be an integer multiple of $\hbar$, was later reinterpreted in 1924 by de Broglie as a standing wave condition: the electron is described by a wave, and a whole number of wavelengths must fit along the circumference of the electron's orbit: $n\lambda = 2\pi r$. According to de Broglie's hypothesis, matter particles such as the electron behave as waves. The de Broglie wavelength of an electron is $\lambda = \frac{h}{m v}$, which implies that $n\frac{h}{m v} = 2\pi r$, or $m v r = \frac{n h}{2\pi}$, where $m v r$ is the angular momentum of the orbiting electron. Writing $L$ for this angular momentum, the previous equation becomes $L = \frac{n h}{2\pi}$, which is Bohr's second postulate. Bohr described the angular momentum of the electron orbit as $L = \frac{n h}{2\pi}$, while de Broglie's wavelength $\lambda = \frac{h}{p}$ described $h$ divided by the electron momentum. In 1913, however, Bohr justified his rule by appealing to the correspondence principle, without providing any sort of wave interpretation. In 1913, the wave behavior of matter particles such as the electron was not suspected. In 1925, a new kind of mechanics was proposed, quantum mechanics, in which Bohr's model of electrons traveling in quantized orbits was extended into a more accurate model of electron motion. The new theory was proposed by Werner Heisenberg.
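The standing-wave reading can be checked numerically. The sketch below is illustrative, not from any of the papers discussed (Python; the orbital speed formula v = αc/n is the standard Bohr-model result, and the variable names are mine):

```python
import math

h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
c = 2.99792458e8        # speed of light, m/s
alpha = 1 / 137.035999  # fine-structure constant
a0 = 5.29177210903e-11  # Bohr radius, m

for n in (1, 2, 3):
    r = n**2 * a0              # Bohr orbit radius
    v = alpha * c / n          # orbital speed in the Bohr model
    lam = h / (m_e * v)        # de Broglie wavelength
    print(f"n={n}: circumference / wavelength = {2 * math.pi * r / lam:.6f}")
    # prints 1, 2, 3: exactly n de Broglie wavelengths fit along each orbit
```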
Another form of the same theory, wave mechanics, was discovered by the Austrian physicist Erwin Schrödinger independently, and by different reasoning. Schrödinger employed de Broglie's matter waves, but sought wave solutions of a three-dimensional wave equation describing electrons that were constrained to move about the nucleus of a hydrogen-like atom, by being trapped by the potential of the positive nuclear charge. Electron energy levels. The Bohr model gives almost exact results only for a system where two charged points orbit each other at speeds much less than that of light. This not only involves one-electron systems such as the hydrogen atom, singly ionized helium, and doubly ionized lithium, but it includes positronium and Rydberg states of any atom where one electron is far away from everything else. It can be used for K-line X-ray transition calculations if other assumptions are added (see Moseley's law below). In high energy physics, it can be used to calculate the masses of heavy quark mesons. Calculation of the orbits requires two assumptions. Derivation. In classical mechanics, if an electron is orbiting around an atom with period T, and if its coupling to the electromagnetic field is weak, so that the orbit doesn't decay very much in one cycle, it will emit electromagnetic radiation in a pattern repeating at every period, so that the Fourier transform of the pattern will only have frequencies which are multiples of 1/T. However, in quantum mechanics, the quantization of angular momentum leads to discrete energy levels of the orbits, and the emitted frequencies are quantized according to the energy differences between these levels. This discrete nature of energy levels introduces a fundamental departure from the classical radiation law, giving rise to distinct spectral lines in the emitted radiation. Bohr assumes that the electron is circling the nucleus in an elliptical orbit obeying the rules of classical mechanics, but with no loss of radiation due to the Larmor formula. Denoting the total energy as "E", the electron charge as −"e", the nucleus charge as +"e", the electron mass as "m"e, and half the major axis of the ellipse as "a", he starts with these equations (in Gaussian units): $E = -\frac{e^2}{2a}$ (1a) and $f = \frac{1}{\pi}\sqrt{\frac{2}{m_e}}\,\frac{|E|^{3/2}}{e^2}$ (1b). "E" is assumed to be negative, because a positive energy is required to unbind the electron from the nucleus and put it at rest at an infinite distance. Eq. (1a) is obtained from equating the centripetal force to the Coulombian force acting between the nucleus and the electron, considering that $E = T + U$ (where "T" is the average kinetic energy and "U" the average electrostatic potential), and that for Kepler's second law, the average separation between the electron and the nucleus is "a". Eq. (1b) is obtained from the same premises of eq. (1a) plus the virial theorem, stating that, for an elliptical orbit, $T = -\frac{U}{2}$. Then Bohr assumes that $|E|$ is an integer multiple of the energy of a quantum of light with half the frequency of the electron's revolution frequency, i.e.: $|E| = n h \frac{f}{2}$ (2). From eqs. (1a), (1b) and (2), it descends: $a = \frac{n^2 h^2}{4\pi^2 m_e e^2}$, $f = \frac{4\pi^2 m_e e^4}{n^3 h^3}$, and $|E| = \frac{2\pi^2 m_e e^4}{n^2 h^2}$ (1c). He further assumes that the orbit is circular, i.e. $a = r$, and, denoting the angular momentum of the electron as "L", introduces the equation: $L = \frac{|E|}{\pi f}$ (4). Eq. (4) stems from the virial theorem, and from the classical mechanics relationships between the angular momentum, the kinetic energy and the frequency of revolution. From eqs.
(1c), (2) and (4), it stems: $L = n\frac{h}{2\pi}$, where $\hbar = \frac{h}{2\pi}$, that is: $L = n\hbar$. This result states that the angular momentum of the electron is an integer multiple of the reduced Planck constant. Combining $L = m_e v r = n\hbar$ with the force balance $\frac{m_e v^2}{r} = \frac{e^2}{r^2}$ and substituting the expression for the velocity gives an equation for "r" in terms of "n": $r_n = \frac{n^2 \hbar^2}{m_e e^2}$, so that the allowed orbit radius at any "n" is $r_n = n^2 a_0$. The smallest possible value of "r" in the hydrogen atom ($n = 1$) is called the Bohr radius and is equal to: $a_0 = \frac{h^2}{4\pi^2 m_e e^2} \approx 5.29 \times 10^{-11}\ \text{m}$. The energy of the "n"-th level for any atom is determined by the radius and quantum number: $E = -\frac{e^2}{2 r_n} = -\frac{13.6\ \text{eV}}{n^2}$. An electron in the lowest energy level of hydrogen ($n = 1$) therefore has about 13.6 eV less energy than a motionless electron infinitely far from the nucleus. The next energy level ($n = 2$) is −3.4 eV. The third ($n = 3$) is −1.51 eV, and so on. For larger values of "n", these are also the binding energies of a highly excited atom with one electron in a large circular orbit around the rest of the atom. The hydrogen formula also coincides with the Wallis product. The combination of natural constants in the energy formula is called the Rydberg energy ("R"E): $R_E = \frac{2\pi^2 m_e e^4}{h^2}$. This expression is clarified by interpreting it in combinations that form more natural units: $m_e c^2$ is the rest mass energy of the electron (511 keV), $\alpha = \frac{e^2}{\hbar c} \approx \frac{1}{137}$ is the fine-structure constant, and $R_E = \frac{1}{2}\alpha^2 m_e c^2$. Since this derivation is made with the assumption that the nucleus is orbited by one electron, we can generalize this result by letting the nucleus have a charge $Ze$, where "Z" is the atomic number. This will now give us energy levels for hydrogenic (hydrogen-like) atoms, which can serve as a rough order-of-magnitude approximation of the actual energy levels. So for nuclei with "Z" protons, the energy levels are (to a rough approximation): $E_n = -\frac{Z^2 \cdot 13.6\ \text{eV}}{n^2}$. The actual energy levels cannot be solved analytically for more than one electron (see "n"-body problem) because the electrons are not only affected by the nucleus but also interact with each other via the Coulomb force. When $Z = 1/\alpha \approx 137$, the motion becomes highly relativistic, and $Z^2$ cancels the $\alpha^2$ in $R_E$; the orbit energy begins to be comparable to the rest energy. Sufficiently large nuclei, if they were stable, would reduce their charge by creating a bound electron from the vacuum, ejecting the positron to infinity. This is the theoretical phenomenon of electromagnetic charge screening which predicts a maximum nuclear charge. Emission of such positrons has been observed in the collisions of heavy ions to create temporary super-heavy nuclei. The Bohr formula properly uses the reduced mass of electron and proton in all situations, instead of the mass of the electron: $\mu = \frac{m_e m_p}{m_e + m_p}$. However, these numbers are very nearly the same, due to the much larger mass of the proton, about 1836.1 times the mass of the electron, so that the reduced mass in the system is the mass of the electron multiplied by the constant $\frac{1836.1}{1 + 1836.1} \approx 0.99946$. This fact was historically important in convincing Rutherford of the importance of Bohr's model, for it explained the fact that the frequencies of lines in the spectra for singly ionized helium do not differ from those of hydrogen by a factor of exactly 4, but rather by 4 times the ratio of the reduced mass for the hydrogen vs. the helium systems, which was much closer to the experimental ratio than exactly 4. For positronium, the formula uses the reduced mass also, but in this case, it is exactly the electron mass divided by 2.
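A short numerical sketch ties these results together (Python; approximate constants and illustrative function names of my own). It evaluates the hydrogen levels and the reduced-mass ratio that helped convince Rutherford, under the crude simplification that the helium nucleus weighs four proton masses:

```python
# Bohr energy levels and the reduced-mass effect on the ionized-helium
# spectrum. A minimal sketch; constants are approximate CODATA values.
m_e = 9.1093837015e-31   # electron mass, kg
m_p = 1.67262192369e-27  # proton mass, kg
RY_EV = 13.605693        # Rydberg energy, eV

def bohr_energy(n, Z=1):
    """Energy of the n-th level of a hydrogen-like ion, in eV."""
    return -RY_EV * Z**2 / n**2

for n in (1, 2, 3):
    print(f"hydrogen n={n}: {bohr_energy(n):.2f} eV")   # -13.61, -3.40, -1.51

# Reduced-mass correction: He+ line frequencies differ from hydrogen's
# not by exactly 4 but by 4 times the ratio of the reduced masses.
m_he = 4 * m_p                       # crude helium nucleus mass
mu_h = m_e * m_p / (m_e + m_p)
mu_he = m_e * m_he / (m_e + m_he)
print(f"He+/H line-frequency ratio: {4 * mu_he / mu_h:.5f}")  # ~4.0016
```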
For any value of the radius, the electron and the positron are each moving at half the speed around their common center of mass, and each has only one fourth the kinetic energy. The total kinetic energy is half what it would be for a single electron moving around a heavy nucleus: $E_n = -\frac{R_E}{2 n^2}$ (positronium). Rydberg formula. Beginning in the late 1860s, Johann Balmer and later Johannes Rydberg and Walther Ritz developed increasingly accurate empirical formulas matching measured atomic spectral lines. Critically for Bohr's later work, Rydberg expressed his formula in terms of wave-number, equivalent to frequency. These formulas contained a constant, $R$, now known as the Rydberg constant, and a pair of integers indexing the lines: $\frac{1}{\lambda} = R\left(\frac{1}{n_1^2} - \frac{1}{n_2^2}\right)$. Despite many attempts, no theory of the atom could reproduce these relatively simple formulas. Bohr's theory, describing the energies of transitions or quantum jumps between orbital energy levels, is able to explain them. For the hydrogen atom Bohr starts with his derived formula for the energy released as a free electron moves into a stable circular orbit indexed by $n$: $E_n = -\frac{R_E}{n^2}$. The energy difference between two such levels is then: $\Delta E = E_{n_2} - E_{n_1} = R_E\left(\frac{1}{n_1^2} - \frac{1}{n_2^2}\right)$. Therefore, Bohr's theory gives the Rydberg formula, and moreover the numerical value of the Rydberg constant for hydrogen, in terms of more fundamental constants of nature, including the electron's charge, the electron's mass, and the Planck constant: $R = \frac{2\pi^2 m_e e^4}{c h^3}$. Since the energy of a photon is $E = \frac{hc}{\lambda}$, these results can be expressed in terms of the wavelength of the photon given off: $\frac{1}{\lambda} = R\left(\frac{1}{n_1^2} - \frac{1}{n_2^2}\right)$. Bohr's derivation of the Rydberg constant, as well as the concomitant agreement of Bohr's formula with the experimentally observed spectral lines of the Lyman ($n_1 = 1$), Balmer ($n_1 = 2$), and Paschen ($n_1 = 3$) series, and the successful theoretical prediction of other lines not yet observed, was one reason that his model was immediately accepted. To apply to atoms with more than one electron, the Rydberg formula can be modified by replacing $Z$ with $Z - b$ or $n$ with $n - b$, where $b$ is a constant representing a screening effect due to the inner-shell and other electrons (see Electron shell and the later discussion of the "Shell Model of the Atom" below). This was established empirically before Bohr presented his model. Shell model (heavier atoms). Bohr's original three papers in 1913 described mainly the electron configuration in lighter elements. Bohr called his electron shells "rings" in 1913. Atomic orbitals within shells did not exist at the time of his planetary model. Bohr explains in Part 3 of his famous 1913 paper that the maximum number of electrons in a shell is eight, writing: "We see, further, that a ring of "n" electrons cannot rotate in a single ring round a nucleus of charge "ne" unless "n" < 8." For smaller atoms, the electron shells would be filled as follows: "rings of electrons will only join together if they contain equal numbers of electrons; and that accordingly the numbers of electrons on inner rings will only be 2, 4, 8". However, in larger atoms the innermost shell would contain eight electrons: "on the other hand, the periodic system of the elements strongly suggests that already in neon "N" = 10 an inner ring of eight electrons will occur". Bohr wrote "From the above we are led to the following possible scheme for the arrangement of the electrons in light atoms:" In Bohr's third 1913 paper, Part III, called "Systems Containing Several Nuclei", he says that two atoms form molecules on a symmetrical plane, and he reverts to describing hydrogen.
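As a quick check, the Rydberg formula with Bohr's constant reproduces the visible Balmer lines of hydrogen. The following is a minimal sketch (Python; the helper name is mine, and the constant is the infinite-nucleus Rydberg constant, so wavelengths come out slightly short of the reduced-mass values):

```python
RYDBERG = 1.0973731568e7  # Rydberg constant for an infinite-mass nucleus, 1/m

def line_wavelength_nm(n1, n2):
    """Wavelength of the photon emitted in the n2 -> n1 transition."""
    inv_lambda = RYDBERG * (1 / n1**2 - 1 / n2**2)
    return 1e9 / inv_lambda

# Balmer series (n1 = 2): the visible hydrogen lines
for n2 in (3, 4, 5, 6):
    print(f"{n2} -> 2: {line_wavelength_nm(2, n2):.1f} nm")
# 656.1, 486.0, 433.9, 410.1 nm: H-alpha through H-delta
```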
The 1913 Bohr model did not discuss higher elements in detail, and John William Nicholson was one of the first to prove, in 1914, that it couldn't work for lithium, but that it was an attractive theory for hydrogen and ionized helium. In 1921, following the work of chemists and others involved in work on the periodic table, Bohr extended the model of hydrogen to give an approximate model for heavier atoms. This gave a physical picture that reproduced many known atomic properties for the first time, although these properties were proposed contemporaneously with the identical work of chemist Charles Rugeley Bury. Bohr's partner in research during 1914 to 1916 was Walther Kossel, who corrected Bohr's work to show that electrons interacted through the outer rings, and Kossel called the rings "shells". Irving Langmuir is credited with the first viable arrangement of electrons in shells, with only two in the first shell and going up to eight in the next according to the octet rule of 1904, although Kossel had already predicted a maximum of eight per shell in 1916. Heavier atoms have more protons in the nucleus, and more electrons to cancel the charge. Bohr took from these chemists the idea that each discrete orbit could only hold a certain number of electrons. Per Kossel, once an orbit is full, the next level would have to be used. This gives the atom a shell structure devised by Kossel, Langmuir, and Bury, in which each shell corresponds to a Bohr orbit. This model is even more approximate than the model of hydrogen, because it treats the electrons in each shell as non-interacting. But the repulsions of electrons are taken into account somewhat by the phenomenon of screening. The electrons in outer orbits do not only orbit the nucleus; they also move around the inner electrons, so the effective charge "Z" that they feel is reduced by the number of the electrons in the inner orbit. For example, the lithium atom has two electrons in the lowest 1s orbit, and these orbit at "Z" = 2. Each one sees the nuclear charge of "Z" = 3 minus the screening effect of the other, which crudely reduces the nuclear charge by 1 unit. This means that the innermost electrons orbit at approximately 1/2 the Bohr radius. The outermost electron in lithium orbits at roughly the Bohr radius, since the two inner electrons reduce the nuclear charge by 2. This outer electron should be at nearly one Bohr radius from the nucleus. Because the electrons strongly repel each other, the effective charge description is very approximate; the effective charge "Z" doesn't usually come out to be an integer. The shell model was able to qualitatively explain many of the mysterious properties of atoms which became codified in the late 19th century in the periodic table of the elements. One property was the size of atoms, which could be determined approximately by measuring the viscosity of gases and the density of pure crystalline solids. Atoms tend to get smaller toward the right in the periodic table, and become much larger at the start of the next row of the table. Atoms to the right of the table tend to gain electrons, while atoms to the left tend to lose them. Every element in the last column of the table is chemically inert (a noble gas). In the shell model, this phenomenon is explained by shell-filling. Successive atoms become smaller because they are filling orbits of the same size, until the orbit is full, at which point the next atom in the table has a loosely bound outer electron, causing it to expand.
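The screening estimate described above reduces to a simple scaling rule: in the Bohr model the orbit radius goes as n^2/Z, so replacing the bare charge with an effective one gives r = n^2 a0 / Z_eff. A short illustrative Python sketch (the function name is ours, not from the article):

a0 = 5.2918e-11   # Bohr radius (m)

def orbit_radius(n, z_eff):
    """Radius of the n-th Bohr orbit for an effective nuclear charge z_eff."""
    return n**2 * a0 / z_eff

# Hydrogen ground state: n = 1, Z_eff = 1 gives exactly one Bohr radius.
print(orbit_radius(1, 1) / a0)   # 1.0

# Each of lithium's two inner electrons sees Z = 3 screened by the other,
# so Z_eff is roughly 2 and the inner electrons sit near half a Bohr radius.
print(orbit_radius(1, 2) / a0)   # 0.5

# In practice electron repulsion is captured only crudely, so fitted
# values of Z_eff are generally not integers.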
The first Bohr orbit is filled when it has two electrons, which explains why helium is inert. The second orbit allows eight electrons, and when it is full the atom is neon, again inert. The third orbit contains eight again, except that in the more correct Sommerfeld treatment (reproduced in modern quantum mechanics) there are extra "d" electrons. The third orbit may hold an extra 10 d electrons, but these positions are not filled until a few more orbitals from the next level are filled (filling the "n" = 3 d-orbitals produces the 10 transition elements). The irregular filling pattern is an effect of interactions between electrons, which are not taken into account in either the Bohr or Sommerfeld models and which are difficult to calculate even in the modern treatment. Moseley's law and calculation (K-alpha X-ray emission lines). Niels Bohr said in 1962: "You see actually the Rutherford work was not taken seriously. We cannot understand today, but it was not taken seriously at all. There was no mention of it any place. The great change came from Moseley." In 1913, Henry Moseley found an empirical relationship between the strongest X-ray line emitted by atoms under electron bombardment (then known as the K-alpha line) and their atomic number "Z". Moseley's empirical formula was found to be derivable from Rydberg's formula and later Bohr's formula (Moseley actually mentions only Ernest Rutherford and Antonius Van den Broek in terms of models, as these had been published before Moseley's work, and Moseley's 1913 paper was published the same month as the first Bohr model paper). The derivation required two additional assumptions: [1] that this X-ray line came from a transition between energy levels with quantum numbers 1 and 2, and [2] that the atomic number "Z", when used in the formula for atoms heavier than hydrogen, should be diminished by 1, to "Z" − 1. Moseley wrote to Bohr, puzzled about his results, but Bohr was not able to help. At that time, he thought that the postulated innermost "K" shell of electrons should have at least four electrons, not the two which would have neatly explained the result. So Moseley published his results without a theoretical explanation. It was Walther Kossel in 1914 and in 1916 who explained that in the periodic table new elements would be created as electrons were added to the outer shell. In Kossel's paper, he writes: "This leads to the conclusion that the electrons, which are added further, should be put into concentric rings or shells, on each of which ... only a certain number of electrons—namely, eight in our case—should be arranged. As soon as one ring or shell is completed, a new one has to be started for the next element; the number of electrons, which are most easily accessible, and lie at the outermost periphery, increases again from element to element and, therefore, in the formation of each new shell the chemical periodicity is repeated." Later, chemist Langmuir realized that the effect was caused by charge screening, with an inner shell containing only 2 electrons. In his 1919 paper, Irving Langmuir postulated the existence of "cells" which could each contain only two electrons, and these were arranged in "equidistant layers". In the Moseley experiment, one of the innermost electrons in the atom is knocked out, leaving a vacancy in the lowest Bohr orbit, which contains a single remaining electron. This vacancy is then filled by an electron from the next orbit, which has n=2.
But the n=2 electrons see an effective charge of "Z" − 1, which is the value appropriate for the charge of the nucleus when a single electron remains in the lowest Bohr orbit to screen the nuclear charge +"Z" and lower it by 1 (due to the electron's negative charge screening the nuclear positive charge). The energy gained by an electron dropping from the second shell to the first gives Moseley's law for K-alpha lines, formula_71 or formula_72 Here, "R"v = "R"E/"h" is the Rydberg constant expressed in terms of frequency, equal to approximately 3.28 × 10^15 Hz. For values of "Z" between 11 and 31 this latter relationship had been empirically derived by Moseley, in a simple (linear) plot of the square root of X-ray frequency against atomic number (however, for silver, Z = 47, the experimentally obtained screening term should be replaced by 0.4). Notwithstanding its restricted validity, Moseley's law not only established the objective meaning of atomic number, but as Bohr noted, it also did more than the Rydberg derivation to establish the validity of the Rutherford/Van den Broek/Bohr nuclear model of the atom, with atomic number (place on the periodic table) standing for whole units of nuclear charge. Van den Broek had published his model in January 1913, showing the periodic table was arranged according to charge, while Bohr's atomic model was not published until July 1913. The K-alpha line of Moseley's time is now known to be a pair of close lines, written as Kα1 and Kα2 in Siegbahn notation. Shortcomings. The Bohr model gives an incorrect value for the ground state orbital angular momentum: the angular momentum in the true ground state is known from experiment to be zero. Although mental pictures fail somewhat at these levels of scale, an electron in the lowest modern "orbital" with no orbital momentum may be thought of as not revolving "around" the nucleus at all, but merely as going tightly around it in an ellipse with zero area (this may be pictured as "back and forth", without striking or interacting with the nucleus). This is only reproduced in a more sophisticated semiclassical treatment like Sommerfeld's. Still, even the most sophisticated semiclassical model fails to explain the fact that the lowest energy state is spherically symmetric – it doesn't point in any particular direction. In modern quantum mechanics, the electron in hydrogen is a spherical cloud of probability that grows denser near the nucleus. The rate-constant of probability-decay in hydrogen is equal to the inverse of the Bohr radius, but since Bohr worked with circular orbits, not zero-area ellipses, the fact that these two numbers exactly agree is considered a "coincidence". (However, many such coincidental agreements are found between the semiclassical and full quantum mechanical treatments of the atom; these include identical energy levels in the hydrogen atom and the derivation of a fine-structure constant, which arises from the relativistic Bohr–Sommerfeld model (see below) and which happens to be equal to an entirely different concept in full modern quantum mechanics.) The Bohr model also failed to explain: Refinements. Several enhancements to the Bohr model were proposed, most notably the Sommerfeld or Bohr–Sommerfeld models, which suggested that electrons travel in elliptical orbits around a nucleus instead of the Bohr model's circular orbits.
This model supplemented the quantized angular momentum condition of the Bohr model with an additional radial quantization condition, the Wilson–Sommerfeld quantization condition: formula_73 where "p"r is the radial momentum canonically conjugate to the coordinate "q"r, which is the radial position, and "T" is one full orbital period. The integral is the action of action-angle coordinates. This condition, suggested by the correspondence principle, is the only one possible, since the quantum numbers are adiabatic invariants. The Bohr–Sommerfeld model was fundamentally inconsistent and led to many paradoxes. The magnetic quantum number measured the tilt of the orbital plane relative to the "xy" plane, and it could only take a few discrete values. This contradicted the obvious fact that an atom could have any orientation relative to the coordinates, without restriction. The Sommerfeld quantization can be performed in different canonical coordinates and sometimes gives different answers. The incorporation of radiation corrections was difficult, because it required finding action-angle coordinates for a combined radiation/atom system, which is difficult when the radiation is allowed to escape. The whole theory did not extend to non-integrable motions, which meant that many systems could not be treated even in principle. In the end, the model was replaced by the modern quantum-mechanical treatment of the hydrogen atom, which was first given by Wolfgang Pauli in 1925, using Heisenberg's matrix mechanics. The current picture of the hydrogen atom is based on the atomic orbitals of wave mechanics, which Erwin Schrödinger developed in 1926. However, this is not to say that the Bohr–Sommerfeld model was without its successes. Calculations based on the Bohr–Sommerfeld model were able to accurately explain a number of more complex atomic spectral effects. For example, up to first-order perturbations, the Bohr model and quantum mechanics make the same predictions for the spectral line splitting in the Stark effect. At higher-order perturbations, however, the Bohr model and quantum mechanics differ, and measurements of the Stark effect under high field strengths helped confirm the correctness of quantum mechanics over the Bohr model. The prevailing theory behind this difference lies in the shapes of the orbitals of the electrons, which vary according to the energy state of the electron. The Bohr–Sommerfeld quantization conditions lead to questions in modern mathematics. A consistent semiclassical quantization condition requires a certain type of structure on the phase space, which places topological limitations on the types of symplectic manifolds which can be quantized. In particular, the symplectic form should be the curvature form of a connection of a Hermitian line bundle, which is called a prequantization. Bohr also updated his model in 1922, assuming that certain numbers of electrons (for example, 2, 8, and 18) correspond to stable "closed shells". Model of the chemical bond. Niels Bohr proposed a model of the atom and a model of the chemical bond. According to his model for a diatomic molecule, the electrons of the atoms of the molecule form a rotating ring whose plane is perpendicular to the axis of the molecule and equidistant from the atomic nuclei. The dynamic equilibrium of the molecular system is achieved through the balance of forces between the forces of attraction of nuclei to the plane of the ring of electrons and the forces of mutual repulsion of the nuclei.
The Bohr model of the chemical bond took into account the Coulomb repulsion – the electrons in the ring are at the maximum distance from each other. Symbolism of planetary atomic models. Although Bohr's atomic model was superseded by quantum models in the 1920s, the visual image of electrons orbiting a nucleus has remained the popular concept of atoms. The concept of an atom as a tiny planetary system has been widely used as a symbol for atoms and even for "atomic" energy (even though this is more properly considered nuclear energy). Examples of its use over the past century include but are not limited to:
4832
43007828
https://en.wikipedia.org/wiki?curid=4832
Bombay Sapphire
Bombay Sapphire is a brand of gin that is distilled by the Bombay Spirits Company, a subsidiary company of Bacardi, at Laverstoke Mill in the village of Laverstoke in the English county of Hampshire. The brand was first launched in 1986 by English wine-merchant International Distillers & Vintners. In 1997, Diageo sold the brand to Bacardi. Its name originates from the gin and tonic popularised by the Royal Indian Armed Forces during the British Raj in colonial India; "Bombay" refers to the Indian city and "Sapphire" refers to the violet-blue Star of Bombay, which was mined in British Ceylon (Sri Lanka) and is now on display at the Smithsonian Institution. Bombay Sapphire is marketed in a flat-sided, sapphire-coloured bottle that bears a picture of Queen Victoria on the label. The flavouring of the drink comes from a recipe of ten ingredients: almond, lemon peel, liquorice, juniper berries, orris root, angelica, coriander, cassia, cubeb, and grains of paradise. Alcohol brought in from another supplier is evaporated three times using a carterhead still, and the alcohol vapours are passed through a mesh/basket containing the ten botanicals to gain flavour and aroma. This is felt to give the gin a lighter, more floral taste compared to gins created using a copper pot still. Water from Lake Vyrnwy, a reservoir in Powys, Wales, is added to bring the strength of Bombay Sapphire down to 40.0% (UK, the Nordics, several continental European markets, Canada and Australia). The 47.0% version is the standard for sale at duty-free shops in all markets. Production. Until 2013, production and bottling of Bombay Sapphire was contracted out by Bacardi to G&J Greenall in Warrington, Cheshire. However, in 2011, plans were announced to move the manufacturing process to a new facility at Laverstoke Mill in Laverstoke, Hampshire. These plans included the restoration of the former Portal's paper mill at the proposed site and the construction of a visitor centre. Planning permission was granted in February 2012, and the centre opened to the public in the autumn of 2014. The visitor centre included a new construction by Thomas Heatherwick of two glasshouses for plants used as botanicals in the production of Bombay Sapphire gin. As part of the transfer of production, two of Greenall's stills were moved from Warrington to Laverstoke. After production was shifted to Laverstoke, bottling of the drink remained contracted out to G&J Greenall, with the undiluted gin being tankered to Warrington for dilution and bottling. Bottling has subsequently been shifted to Glasgow in Scotland. Varieties. Bacardi also markets Bombay Original London Dry Gin (or Bombay Original Dry). Eight botanical ingredients are used in the production of the Original Dry variety, as opposed to the ten in Bombay Sapphire. "Wine Enthusiast" preferred it to Bombay Sapphire. In September 2011, Bombay Sapphire East was launched in test markets in New York and Las Vegas. This variety has another two botanicals, lemongrass and black peppercorns, in addition to the original ten. It is bottled at 42% and was designed to counteract the sweetness of most tonic water. A special edition of Bombay gin called Star of Bombay was produced in 2015 for the UK market. It is bottled at 47.5% and is distilled from grain. It features bergamot and ambrette seeds in harmony with Bombay's signature botanicals. This version has since been extended to several other markets. Bombay Bramble is a variety infused with blackberries and raspberries and bottled at 37.5% ABV.
In the summer of 2019, Bacardi launched a limited edition gin called Bombay Sapphire English Estate, which features three additional English-sourced botanicals: pennyroyal mint, rosehip, and hazelnut. It is bottled at 41%. Design connection. The brand started a series of design collaborations. Their first step into the design world was a series of advertisements featuring work from currently popular designers. Their works, varying from martini glasses to tiles and cloth patterns, are labelled as "Inspired by Bombay Sapphire". The campaign featured designers such as Marcel Wanders, Yves Béhar, Karim Rashid, Ulla Darni, and Dror Benshetrit, and performance artist Jurgen Hahn. From the success of this campaign, the company began a series of events and sponsored locations. The best known is the Bombay Sapphire Designer Glass Competition, held each year, where design students worldwide can participate by designing their own "inspired" martini cocktail glass. The finalists (one from each participating country) are then invited to the yearly Salone del Mobile, an international design fair in Milan, where the winner is chosen. Bombay Sapphire also endorses glass artists and designers with the Bombay Sapphire Prize, which is awarded yearly to an outstanding design featuring glass. Bombay Sapphire also showcases the designers' work in the Bombay Sapphire-endorsed blue room, a design exhibition touring the world each year. From 2008, the Bombay Sapphire Designer Glass Competition final has been held at 100% Design in London, and the Bombay Sapphire Prize takes place in Milan at the Salone del Mobile. Evaluation. Bombay Sapphire has been reviewed by several outside spirits ratings organisations with varying degrees of success. Recently, it was awarded a score of 92 (on a 100-point scale) by the Beverage Testing Institute. Ratings aggregator Proof66.com categorizes the Sapphire as a Tier 2 spirit, indicating highly favourable "expert" reviews.
4833
16185737
https://en.wikipedia.org/wiki?curid=4833
Bob Wills
James Robert "Bob" Wills (March 6, 1905 – May 13, 1975) was an American musician, songwriter, and bandleader. Considered by music authorities as the founder of Western swing, he was known widely as the King of Western Swing (although Spade Cooley self-promoted the moniker "King of Western Swing" from 1942 to 1969). He was also noted for punctuating his music with his trademark "ah-haa" calls. Wills formed several bands and played radio stations around the South and West until he formed the Texas Playboys in 1934 with Wills on fiddle, Tommy Duncan on piano and vocals, rhythm guitarist June Whalin, tenor banjoist Johnnie Lee Wills, and Kermit Whalin who played steel guitar and bass. Oklahoma guitar player Eldon Shamblin joined the band in 1937 bringing jazzy influence and arrangements. The band played regularly on Tulsa, Oklahoma, radio station KVOO and added Leon McAuliffe on steel guitar, pianist Al Stricklin, drummer Smokey Dacus, and a horn section that expanded the band's sound. Wills favored jazz-like arrangements and the band found national popularity into the 1940s with such hits as "Steel Guitar Rag", "San Antonio Rose", "Smoke on the Water", "Stars and Stripes on Iwo Jima", and "New Spanish Two Step". Wills and the Texas Playboys recorded with several publishers and companies, including Vocalion, Okeh, Columbia, and MGM. In 1950, Wills had two top 10 hits, "Ida Red likes the Boogie" and "Faded Love", which were his last hits for a decade. Throughout the 1950s, he struggled with poor health and tenuous finances. He continued to perform frequently despite a decline in the popularity of his earlier hit songs, and the growing popularity of rock and roll. Wills had a heart attack in 1962, and a second one the next year, which forced him to disband the Texas Playboys. Wills continued to perform solo. The Country Music Hall of Fame inducted Wills in 1968 and the Texas State Legislature honored him for his contribution to American music. In 1972, Wills accepted a citation from the American Society of Composers, Authors and Publishers in Nashville. He recorded an album with fan Merle Haggard in 1973. Wills suffered two strokes that left him partially paralyzed, and unable to communicate. He was comatose the last two months of his life, and died in a Fort Worth nursing home from pneumonia in 1975. The Rock and Roll Hall of Fame inducted Wills and the Texas Playboys in 1999. Biography. Early years. He was born on a cotton farm in Kosse, Texas, to Emma Lee Foley and John Tompkins Wills. His parents were both of primarily English ancestry, but had distant Irish ancestry, as well. The entire Wills family was musically inclined. His father was a statewide champion fiddle player, and several of his siblings played musical instruments. The family frequently held country dances in their home, and while living in Hall County, Texas, they also played at "ranch dances", which were popular throughout West Texas. In this environment, Wills learned to play the fiddle and the mandolin early. Wills not only learned traditional music from his family, but he also learned some blues songs directly from African-American families who worked in the cotton fields near Lakeview, Texas. As a child, he mainly interacted with African-American children, learning their musical styles and dances such as jigs. Aside from his own family, he knew few other white children until he was seven or eight years old. New Mexico and Texas. 
The family moved to Hall County in the Texas Panhandle in 1913, and in 1919 they bought a farm between the towns of Lakeview, Texas, and Turkey, Texas. At the age of 16, Wills left the family and hopped a freight train, travelling under the name Jim Rob. He drifted from town to town trying to earn a living for several years, once nearly falling from a moving train. In his 20s, he attended barber school, married his first wife Edna, and moved first to Roy, New Mexico, then returned to Turkey in Hall County (now considered his home town) to work as a barber at Ham's Barber Shop. He alternated barbering and fiddling, even when he moved to Fort Worth, Texas, after leaving Hall County in 1929. There, he played in minstrel and medicine shows, and, as with other Texas musicians such as Ocie Stockard, continued to earn money as a barber. He wore blackface makeup to appear in comedy routines, something that was common at the time. Wills played the violin and sang, and had two guitarists and a banjo player with him. "Bob was in blackface and was the comic; he cracked jokes, sang, and did an amazing jig dance." Since there was already a Jim on the show, the manager began calling him Bob. As Jim Rob Wills, though, paired with Herman Arnspiger, he made his first commercial (though unissued) recordings in November 1929 for Brunswick/Vocalion. Wills quickly became known for being talkative on the bandstand, a tendency he picked up from family, local cowboys, and the style of Black musicians he had heard growing up. While in Fort Worth, Wills added the "rowdy city blues" of Bessie Smith and Emmett Miller, whom he idolized, to a repertoire of mainly waltzes and breakdowns he had learned from his father, and patterned his vocal style after that of Miller and other performers such as Al Bernard. His 1935 version of "St. Louis Blues" replicates Al Bernard's patter from the 1928 version of the song. He described his love of Bessie Smith's music with an anecdote: "I rode horseback from the place between the rivers to Childress to see Bessie Smith... She was about the greatest thing I had ever heard. In fact, there was no doubt about it. She was the greatest thing I ever heard." In Fort Worth, Wills met Herman Arnspiger and formed the Wills Fiddle Band. In 1930, Milton Brown joined the group as lead vocalist and brought a sense of innovation and experimentation to the band, which became known as the Aladdin Laddies and then soon renamed itself the Light Crust Doughboys because of radio sponsorship by the makers of Light Crust Flour. Brown left the band in 1932 to form the Musical Brownies, the first true Western swing band. Brown added twin fiddles, tenor banjo, and slap bass, pointing the music in the direction of swing, which they played on local radio and at dancehalls. The Texas Playboys. After forming a new band, The Playboys, and relocating to Waco, Texas, Wills found enough popularity there to decide on a bigger market. They left Waco in January 1934 for Oklahoma City. Wills soon settled the renamed Texas Playboys in Tulsa, Oklahoma, and began broadcasting noon shows over the 50,000-watt KVOO radio station, from the stage of Cain's Ballroom. They also played dances in the evenings. Wills largely sang blues and sentimental ballads. "Lone Star Rag", "Rat Cheese Under the Hill", "Take Me Back to Tulsa", "Basin Street Blues", "Steel Guitar Rag", and "Trouble in Mind" were some of the songs in the extensive repertory played by Wills and the Playboys. 
Wills added a trumpet to the band inadvertently when he hired Everet Stover as an announcer, not knowing that he had played with the New Orleans symphony and had directed the governor's band in Austin. Stover, thinking he had been hired as a trumpeter, began playing with the band, and Wills never stopped him. Although Wills initially disapproved of the instrument, young saxophonist Zeb McNally was eventually hired. Wills hired the young, "modern-style musician" Smokey Dacus as a drummer to balance out the horns. He continued to expand the lineup through the mid- to late 1930s. The addition of steel guitar whiz Leon McAuliffe in March 1935 brought not only a formidable instrumentalist, but also a second engaging vocalist. Wills and the Texas Playboys made their first recordings on September 23–25, 1935, in Dallas. Session rosters from 1938 show both lead guitar and electric guitar in addition to guitar and steel guitar in the Texas Playboys recordings. About this time, Wills purchased and performed with an antique Guadagnini violin. The instrument, worth an estimated $7,600 at the time, was purchased for only $1,600. In 1940, "New San Antonio Rose" sold a million records and became the signature song of the Texas Playboys. The "front line" of Wills' orchestra consisted of either fiddles or guitars after 1944. Film career. In 1940, Wills, along with the Texas Playboys, co-starred with Tex Ritter in "Take Me Back to Oklahoma". Altogether, Wills appeared in 19 films, including "The Lone Prairie" (1942), "Riders of the Northwest Mounted" (1943), "Saddles and Sagebrush" (1943), "The Vigilantes Ride" (1943), "The Last Horseman" (1944), "Rhythm Round-Up" (1945), "Blazing the Western Trail" (1945), and "Lawless Empire" (1945). Swing era. In December 1942, after several band members had left the group, and as World War II raged, Wills joined the Army at the age of 37, but received a medical discharge in 1943. After leaving the Army, Wills moved to Hollywood and began to reorganize the Texas Playboys. He became an enormous draw in Los Angeles, where many of his fans had relocated during the Great Depression and World War II in search of jobs. Monday through Friday, the band played the noon hour timeslot over KMTR-AM (now KLAC) in Los Angeles. They also played regularly at the Mission Beach Ballroom in San Diego. He commanded enormous fees playing dances there, and began to make more creative use of electric guitars to replace the big horn sections the Tulsa band had boasted. For a very brief period in 1944, the Wills band included 23 members, and around mid-year, he toured Northern California and the Pacific Northwest with 21 pieces in the orchestra. "Billboard" reported that Wills out-grossed Harry James, Benny Goodman, "both Dorsey brothers bands, et al." at Civic Auditorium in Oakland, California, in January 1944. Wills and His Texas Playboys began their first cross-country tour in November 1944, and appeared at the Grand Ole Opry on December 30, 1944. According to Opry policy, drums and horns were considered pop instruments, inappropriate to country music. The Opry had two Western swing bands on its roster, led by Pee Wee King and Paul Howard. Neither was allowed to use their drummers at the Opry. Wills' band at the time consisted of two fiddlers, two bass fiddles, two electric guitars, electric steel guitar, and a trumpet. Wills's then-drummer was Monte Mountjoy, who played in the Dixieland style. Wills battled Opry officials and refused to perform without his drummer.
An attempt to compromise by keeping Mountjoy behind a curtain collapsed when Wills had his drums placed front and center onstage at the last minute. In 1945, Wills' dances were drawing larger crowds than dances put on by Tommy Dorsey and Benny Goodman. That year, he lived in both Santa Monica and Fresno, California. In 1947, he opened the Wills Point nightclub in Sacramento, California, and continued touring the Southwest and Pacific Northwest from Texas to Washington. In Sacramento, he broadcast shows over KFBK, a station whose reach encompassed much of the American West. Wills was in such high demand that venues would book him even on weeknights, because they knew the show would still be a draw. During the postwar period, KGO radio in San Francisco syndicated a Bob Wills and His Texas Playboys show recorded at the Fairmont Hotel. Many of these recordings survive today as the Tiffany Transcriptions and are available on CD. They show off the band's strengths significantly, in part because the group was not confined to the three-minute limits of 78 RPM discs. On April 3, 1948, Wills and the Texas Playboys appeared for the inaugural broadcast of the "Louisiana Hayride" on KWKH, broadcasting from the Municipal Auditorium in Shreveport, Louisiana. Wills and the Texas Playboys played dances throughout the West to more than 10,000 people every week. They held dance attendance records at Jantzen Beach in Portland, Oregon; Santa Monica, California; Klamath Falls, Oregon; and California's Oakland Auditorium, where they drew 19,000 people over two nights. Wills recalled the early days of what became known as Western swing music in a 1949 interview: "Here's the way I figure it. We sure not tryin' to take credit for swingin' it." Still a binge drinker, Wills became increasingly unreliable in the late 1940s, causing a rift with Tommy Duncan (who bore the brunt of audience anger when Wills's binges prevented him from appearing). It ended when he fired Duncan in the fall of 1948. Later years. Having lived a lavish lifestyle in California, Wills moved back to Oklahoma City in 1949, then went back on the road to maintain his payroll and Wills Point. He opened a second club, the Bob Wills Ranch House, in Dallas, Texas. Turning the club over to managers, later revealed to be dishonest, left Wills in desperate financial straits with heavy debts to the IRS for back taxes. This caused him to sell many assets, including the rights to "New San Antonio Rose". In 1950, Wills had two top-10 hits, "Ida Red Likes the Boogie" and "Faded Love". After 1950, radio stations began to increasingly specialize in one form or another of commercially popular music. Although usually labelled "country and western", Wills did not fit into the style played on popular country and western stations, which typically played music in the Nashville sound. Neither did he fit into the conventional sound of pop stations, although he played a good deal of pop music. Wills continued to appear at the Bostonia Ballroom in San Diego throughout the 1950s. He continued to tour and record through the 1950s into the early 1960s despite the fact that Western swing's popularity, even in the Southwest, had greatly diminished. Charles R. Townsend described his drop in popularity: Bob could draw "a thousand people on Monday night between 1950 and 1952, but he could not do that by 1956. Entertainment habits had changed." 
On Wills' return to Tulsa late in 1957, Jim Downing of the "Tulsa Tribune" wrote an article headlined "Wills Brothers Together Again: Bob Back with Heavy Beat". The article quotes Wills as saying "Rock and roll? Why, man, that's the same kind of music we've been playin' since 1928! ... We didn't call it rock and roll back when we introduced it as our style back in 1928, and we don't call it rock and roll the way we play it now. But it's just basic rhythm and has gone by a lot of different names in my time. It's the same, whether you just follow a drum beat like in Africa or surround it with a lot of instruments. The rhythm's what's important." The use of amplified guitars accentuates Wills's claim; some Bob Wills recordings from the 1930s and 1940s sound similar to rock and roll records of the 1950s. Even a 1958 return to KVOO, where his younger brother Johnnie Lee Wills had maintained the family's presence, did not produce the success he hoped for. He appeared twice on ABC-TV's "Jubilee USA" and kept the band on the road into the 1960s. After two heart attacks, in 1965, he dissolved the Texas Playboys (who briefly continued as an independent unit) to perform solo with house bands. While he did well in Las Vegas and other areas, and made records for the Kapp Records label, he was largely a forgotten figure—even though inducted into the Country Music Hall of Fame in 1968. A 1969 stroke left his right side paralyzed, ending his active career. He did, however, recover sufficiently to appear in a wheelchair at various Wills tributes held in the early 1970s. A revival of interest in his music, spurred by Merle Haggard's 1970 album "A Tribute to the Best Damn Fiddle Player in the World", led to a 1973 reunion album, teaming Wills, who spoke with difficulty, with key members of the early band, as well as Haggard. Wills died in Fort Worth of pneumonia on May 13, 1975. Personal life. Bob Wills was married six times and divorced five times. He was twice married to, and twice divorced from, Mary Helen Brown, the widow of Wills' ex-band member Milton Brown. Legacy. Wills' style influenced performers Buck Owens, Merle Haggard, and The Strangers and helped to spawn a style of music now known as the Bakersfield Sound. (Bakersfield, California, was one of Wills' regular stops in his heyday). A 1970 tribute album by Haggard, "A Tribute to the Best Damn Fiddle Player in the World (or, My Salute to Bob Wills)" directed a wider audience to Wills's music, as did the appearance of younger "revival" bands like Asleep at the Wheel and Commander Cody and His Lost Planet Airmen plus the growing popularity of longtime Wills disciple and fan Willie Nelson. By 1971, Wills recovered sufficiently to travel occasionally and appear at tribute concerts. In 1973, he participated in a final reunion session with members of some of the Texas Playboys from the 1930s to the 1960s. Merle Haggard was invited to play at this reunion. The session, scheduled for two days, took place in December 1973, with the album to be titled "For the Last Time". Wills, speaking or attempting to holler, appeared on a couple of tracks from the first day's session but suffered a stroke overnight. He had a more severe one a few days later. The musicians completed the album without him. Wills by then was comatose. He lingered until his death on May 13, 1975. Reviewing "For the Last Time" in "Christgau's Record Guide: Rock Albums of the Seventies" (1981), Robert Christgau wrote: "This double-LP doesn't represent the band at its peak.
But though earlier recordings of most of these classic tunes are at least marginally sharper, it certainly captures the relaxed, playful, eclectic Western swing groove that Wills invented in the '30s." In addition to being inducted into the Country Music Hall of Fame in 1968, Wills was inducted into the Nashville Songwriters Hall of Fame in 1970 and the Rock and Roll Hall of Fame, in the Early Influence category along with the Texas Playboys, in 1999, and he received the Grammy Lifetime Achievement Award in 2007. From 1974 until his 2002 death, Waylon Jennings performed a song he had written called "Bob Wills Is Still the King". Released as the B-side of a single that was a double-sided hit, it went to number one on the country charts. The song has become a staple of classic country radio station formats. In addition, The Rolling Stones performed this song live in Austin, Texas, at Zilker Park on their A Bigger Bang Tour, a shout-out to Wills. This performance was included on their subsequent DVD "The Biggest Bang". In a 1968 issue of "Guitar Player", rock guitarist Jimi Hendrix said of Wills and the Playboys: "I dig them. The Grand Ole Opry used to come on, and I used to watch that. They used to have some pretty heavy cats, some heavy guitar players." In fact, Bob Wills and His Texas Playboys only performed on the Opry twice: in 1944 and 1948. Hendrix almost surely referred to Nashville guitarists. Wills ranked number 27 in "CMT's 40 Greatest Men in Country Music" in 2003. Wills' upbeat 1938 song "Ida Red" was Chuck Berry's primary inspiration for creating his first rock-and-roll hit "Maybellene". Fats Domino once remarked that he patterned his 1960 rhythm section after that of Bob Wills. During the 49th Grammy Awards in 2007, Carrie Underwood performed his song "San Antonio Rose". Today, George Strait performs Wills' music on concert tours and records songs influenced by Wills and his Texas-style swing. The Austin-based Western swing band Asleep at the Wheel has honored Wills' music since the band's inception, most notably with their continuing performances of the musical drama "A Ride with Bob", which debuted in Austin in March 2005 to coincide with celebrations of Wills' 100th birthday. The Bob Wills Birthday Celebration is held every year in March at the Cain's Ballroom in Tulsa, Oklahoma, with a Western swing concert and dance. In 2004, a documentary film about his life and music, titled "Fiddlin' Man: The Life and Music of Bob Wills", was released by VIEW Inc. In 2011, Proper Records released an album by Hot Club of Cowtown titled "What Makes Bob Holler: A Tribute to Bob Wills and His Texas Playboys" and the Texas Legislature adopted a resolution designating Western swing as the official State Music of Texas. The Greenville Chamber of Commerce hosts an annual Bob Wills Fiddle Festival and Contest in downtown Greenville, Texas, in November. Bob Wills was honored in episode two of Ken Burns' 2019 PBS series "Country Music". In 2021, Wills was inducted into the Texas Cowboy Hall of Fame.
4834
48973418
https://en.wikipedia.org/wiki?curid=4834
Badtrans
BadTrans is a malicious Microsoft Windows computer worm distributed by e-mail. Because of a known vulnerability in older versions of Internet Explorer (CVE-2001-0154), some e-mail programs, such as Microsoft Outlook Express and Microsoft Outlook, may install and execute the worm as soon as the e-mail message is viewed. Once executed, the worm replicates by sending copies of itself to other e-mail addresses found on the host's machine, and installs a keystroke logger, which then captures everything typed on the affected computer. Badtrans then transmits the data to one of several e-mail addresses. Among the e-mail addresses that received the keylogged data were free addresses at Excite, Yahoo, and IJustGotFired.com. The target address at IJustGotFired began receiving e-mails at 3:23pm on November 24, 2001. Once the account exceeded its quotas, it was automatically disabled, but the messages were still saved as they arrived. The address received over 100,000 keylogs in the first day alone. In mid-December, the FBI contacted Rudy Rucker, Jr., owner of MonkeyBrains, and requested a copy of the keylogged data. All of that data was stolen from the victims of the worm; it includes no information about the creator of Badtrans. Instead of complying with the FBI request, MonkeyBrains published a database website, https://web.archive.org/web/20070621140432/https://badtrans.monkeybrains.net/ for the public to determine whether a given address has been compromised. The database does not reveal the actual passwords or keylogged data.
4836
49909580
https://en.wikipedia.org/wiki?curid=4836
Barış Manço
Mehmet Barış Manço (born Tosun Yusuf Mehmet Barış Manço; 2 January 1943 – 1 February 1999), better known by his stage name Barış Manço, was a Turkish rock musician, singer, composer, actor, television producer and show host. Beginning his musical career while attending Galatasaray High School, he was a pioneer of rock music in Turkey and one of the founders of the Anatolian rock genre. Manço composed around 200 songs and is among the best-selling and most awarded Turkish artists to date. Many of his songs were translated into other languages including English, French, Japanese, Greek, Italian, Bulgarian, Romanian, Persian, Hebrew, Urdu, Arabic, and German. Through his TV programme, "7'den 77'ye" ("From 7 to 77"), Manço travelled the world and visited many countries. He remains one of Turkey's most popular public figures long after his death. Early life and career. Barış Manço was born in Üsküdar, Istanbul, Turkey on 2 January 1943. His mother, Rikkat Uyanık, who was born in Adana, was a famous singer in the early 1940s. His older brother, who was born during World War II, was named Savaş ("War" in Turkish), while he was named Barış ("Peace" in Turkish) by his parents to celebrate the end of the war. At birth, he was additionally named Tosun Yusuf after his deceased uncle Yusuf, nicknamed Tosun (literally: Joseph the Sturdy). However, this name was erased just before he attended primary school. During his time at Galatasaray High School (and later at Şişli Terakki High School) in the 1950s, he formed his first band, Kafadarlar (The Buddies), allegedly after seeing a performance by Erkin Koray's band, whose members were all students at the nearby Deutsche Schule Istanbul ("İstanbul Alman Lisesi"). Prof. Dr. Asaf Savaş Akat, a famous economist in Turkey, played saxophone, while guitarist Ender Enön made his own guitar because it was difficult to find a real one on the market in those years. In 1962 and 1963, with his next band, Harmoniler (The Harmonies), he recorded cover versions of some popular American twist songs and rearrangements of Turkish folk songs in rock and roll form, thus marking the beginning of the Anatolian rock movement, a synthesis of Turkish folk music and rock. In this period, his key visual and musical influence was Elvis Presley. After graduating from high school in 1963, he moved to Europe, traveling to Paris and Liège where he formed bands with local musicians and recorded some singles mainly in English and French but also in Turkish. Then, in 1964, he resumed his studies at the Royal Academy of Fine Arts in Liège, Belgium. He toured with his band Les Mistigris (not related to Mistigris) in Germany, Belgium, France and Turkey until 1967. In 1967, he was involved in a serious car accident, after which he started to grow his signature moustache to conceal his scar. Frustrated by the difficulties of working with musicians of different nationalities, he formed Kaygısızlar (The Carefrees), featuring Mazhar Alanson and Fuat Güner, future members of the band MFÖ. He recorded several singles and toured with the band, both domestically and internationally, until the band members protested that they did not want to live abroad. In 1970, he formed Barış Manço Ve... (Barış Manço and...) again with foreign musicians, to record his first hit single, "Dağlar Dağlar" (Mountains, Mountains!), which was a success in both Turkey and Belgium, selling over 700,000 copies. It remains one of his most popular songs. 1970s.
After the success of "Dağlar Dağlar", Manço recorded a couple of singles with Moğollar (The Mongols), another influential Turkish Anatolian rock band. He then decided to return to Turkey, where he recorded with the reformed Kaygısızlar for a short period. In 1971, his early works were compiled on his first full-length album, "Dünden Bugüne" ("From Yesterday to Today"), today commonly referred to as "Dağlar Dağlar". In 1972, he formed the legendary Kurtalan Ekspres, which would accompany him until his death. While continuing to release singles, in 1975 he also released his first non-compilation LP "2023", a concept album that included many instrumental pieces. In a last attempt to achieve international success, he released an LP simply entitled "Baris Mancho" (1976), a strange transcription of his name. It was mostly completed with the George Hayes Orchestra on the CBS Records label in Europe and South Africa. Although the album did not bring him the fame he was hoping for, it did top the charts in Romania and Morocco. The following year, the album was released in Turkey under the title "Nick the Chopper". In 1975 he starred in the movie "Baba Bizi Eversene" (Father Make Us Marry), the only film of his acting career. Its music is a compilation of tracks composed by Barış Manço and Kurtalan Ekspres. From 1977 to 1980, he released three more albums in Turkey, partly consisting of compilations of older singles, namely "Sakla Samanı Gelir Zamanı" (1977), "Yeni Bir Gün" (A New Day, 1979) and "20. Sanat Yılı Disko Manço" (1980), all following a similar sound to "2023". All these albums are now rarities, but most of the material is available in the later compilations "Ben Bilirim" and "Sarı Çizmeli Mehmet Ağa". 1980s. In 1981, Manço released "Sözüm Meclisten Dışarı" with Kurtalan Ekspres. It contained many hit songs, including "Alla Beni Pulla Beni", "Arkadaşım Eşek" (My Friend the Donkey), "Gülpembe" (Pink Rose), "Halhal" and "Dönence". The album remains one of their most popular works and boosted Manço's popularity during the 1980s. "Arkadaşım Eşek" quickly grew very popular with children (the song is about rural nostalgia and was not initially intended as a children's song) and he went on to write many other songs primarily for children, becoming an icon among Turkish children in the 1980s and 1990s. On the other hand, "Gülpembe", a requiem for Manço's grandmother composed by Kurtalan Ekspres bassist Ahmet Güvenç, attracted older listeners and is probably Manço's most popular song, "Dağlar Dağlar" being its only real competition. In 1983, "Estağfurullah, Ne Haddimize" was released. It contained the hit songs "Halil İbrahim Sofrası" (Halil İbrahim's Dinner Table) and "Kol Düğmeleri" (Sleeve Buttons), a new version of the artist's first song. "Halil İbrahim Sofrası" exemplified Manço's typically moralistic lyrics, a rare feature in Turkish pop music. In 1985, "24 Ayar Manço" (24 Carat Manço), which included "Gibi Gibi" and the long conceptual song "Lahburger", was released. It marked the beginning of a shift in Manço's sound, characterised by the heavy use of synthesisers and drum machines in contrast with older works made up of a group-oriented, rock-based sound. In subsequent years, Manço released "Değmesin Yağlı Boya" (1986, A Touch of Oil Paint), "Sahibinden İhtiyaçtan" (1988) and "Darısı Başınıza" (1989), all containing a couple of hit songs and demonstrating his new sound. "7'den 77'ye" and 1990s.
In 1988, "7'den 77'ye" ("From 7 to 77"), a TV show directed and presented by Manço, began to run on TRT 1, the Turkish state television channel. It was a combined music, talk show, and documentary programme which was a major hit during the eight years it was on air. Manço traveled to almost 150 countries for the show. "Adam Olacak Çocuk (The Child Will Become A Man)", a section of the show dedicated to children, strengthened Manço's popularity among younger audiences. Although his fame continued in the 1990s thanks to the success of his TV show, which was popular with all age groups, his music in this period was not as successful as it had been in previous decades. The albums "Mega Manço" (1992, Great Manço) and "Müsadenizle Çocuklar" (1995) were considered the weakest offerings of his career despite the limited success of the 1992 children's hit "Ayı" (The Bear). On the other hand, in 1995 he toured in Japan with Kurtalan Ekspres, which leding to "Live In Japan" (1996), his only live album. He released two albums in that country and gained recognition as "the man who writes songs about vegetables", referring to "Domates, Biber, Patlıcan" ("Tomato, Pepper, Aubergine") and "Nane, Limon Kabuğu" (Mint, Lemon Rind), two of his hit songs from the 1980s. Death. On 1 February 1999, Barış Manço died at his Moda house after suffering a sudden heart attack before the release of his newly completed last work "Mançoloji" ("Mançology" or "Manchology") (1999), a double album containing new recordings of his hit songs along with an unfinished instrumental song "40. Yıl" (The 40th Year), celebrating his 40th year in music. His sudden death caused almost unanimous shock in Turkey with millions of people mourning and tens of thousands of people attending his funeral. He was interred in Kanlıca Cemetery in Istanbul. Legacy. Barış Manço was one of Turkey's most influential pop musicians. In his early career he and his different bands contributed to the Turkish rock movement by combining traditional Turkish folk music with rock influences, which is still one of the main characteristics of many of modern Turkish pop music. His image, featuring long hair, a moustache and several outsized rings, moderated the reaction of otherwise conservative Turkish public opinion. Manço pioneered the progressive rock-influenced Anatolian rock movement of the 1970s and his experimentation with electronic instruments a decade later contributed to the sound of Turkish popular music in the 1990s. His lyrics, covering diverse themes but mostly presenting a somewhat modernised version of the "aşık" (wandering folk poets) tradition, were marginal to the pop music scene of the 1980s which was mostly dominated by love songs. In 2002, a tribute album was released under the name "Yüreğimdeki Barış Şarkıları" ("Songs of Barış (Peace) In My Heart"), featuring fifteen popular Turkish artists from such diverse genres as arabesque, and pop and rock (both Anatolian and western style), demonstrating his wide range of influences. On 2 January 2013, Google Doodle celebrated Barış Manço's 70th birthday. Discography. Singles. With Harmoniler With Jacques Denjean Orchestra With Les Mistigris With Kaygısızlar With Barış Manço Ve With Moğollar With Moğollar / Kaygısızlar With Kaygısızlar / Les Mistigris With Kurtalan Ekspres With George Hayes Orchestra / Kurtalan Ekspres With Kurtalan Ekspres
4840
122189
https://en.wikipedia.org/wiki?curid=4840
Blitz BASIC
Blitz BASIC is the programming language dialect of the first Blitz compilers, devised by New Zealand–based developer Mark Sibly. Being derived from BASIC, Blitz syntax was designed to be easy to pick up for beginners first learning to program. The languages are game-programming oriented, but are often found general-purpose enough to be used for most types of application. The Blitz language evolved as new products were released, with recent incarnations offering support for more advanced programming techniques such as object-orientation and multithreading. This led to the languages losing their BASIC moniker in later years. History. The first iteration of the Blitz language was created for the Amiga platform and published by the Australian firm Memory and Storage Technology. After Sibly returned to New Zealand, Blitz BASIC 2 was published several years later (around 1993, according to a press release) by Acid Software, a local Amiga game publisher. Since then, Blitz compilers have been released on several platforms. Following the demise of the Amiga as a commercially viable platform, the Blitz BASIC 2 source code was released to the Amiga community. Development continues to this day under the name AmiBlitz. BlitzBasic. Idigicon published BlitzBasic for Microsoft Windows in October 2000. The language included a built-in API for performing basic 2D graphics and audio operations. Following the release of Blitz3D, BlitzBasic is often synonymously referred to as Blitz2D. Recognition of BlitzBasic increased when a limited range of "free" versions was distributed in popular UK computer magazines such as "PC Format". This resulted in a legal dispute between the developer and publisher, which was eventually resolved amicably. BlitzPlus. In February 2003, Blitz Research Ltd. released BlitzPlus, also for Windows. It lacked the 3D engine of Blitz3D, but did bring new features to the 2D side of the language by implementing limited Windows control support for creating native GUIs. Backwards compatibility of the 2D engine was also extended, allowing compiled BlitzPlus games and applications to run on systems that might only have DirectX 1. BlitzMax. The first BlitzMax compiler was released in December 2004 for Mac OS X. This made it the first Blitz dialect that could be compiled on *nix platforms. Compilers for Microsoft Windows and Linux were subsequently released in May 2005. BlitzMax brought the largest change of language structure to the modern range of Blitz products by extending the type system to include object-oriented concepts (a brief sketch follows below) and modifying the graphics API to better suit OpenGL. BlitzMax was also the first of the Blitz languages to represent strings internally using UCS-2, allowing native support for string literals composed of non-ASCII characters. BlitzMax's platform-agnostic command-set allows developers to compile and run source code on multiple platforms. However, the official compiler and build chain will only generate binaries for the platform on which it is executing. Unofficially, users have been able to get Linux and Mac OS X to cross-compile to the Windows platform. BlitzMax is also the first modular version of the Blitz languages, improving the extensibility of the command-set. In addition, all of the standard modules shipped with the compiler are open-source and so can be tweaked and recompiled by the programmer if necessary. The official BlitzMax cross-platform GUI module (known as MaxGUI) allows developers to write GUI interfaces for their applications on Linux (FLTK), Mac (Cocoa) and Windows.
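For illustration, the object-oriented additions mentioned above can be seen in a minimal BlitzMax sketch (an example written for this purpose, not taken from official documentation): a user-defined type with a field and a method, instantiated with New.

SuperStrict

Type TGreeter
    Field name:String               ' fields hold per-instance data
    Method Greet()                  ' methods attach behaviour to the type
        Print "Hello, " + name + "!"
    End Method
End Type

Local g:TGreeter = New TGreeter     ' instances are created with New
g.name = "BlitzMax"
g.Greet()                           ' prints "Hello, BlitzMax!"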
Various user-contributed modules extend the use of the language by wrapping such libraries as wxWidgets, Cairo, and Fontconfig, as well as a selection of database modules. There is also a selection of third-party 3D modules available, namely MiniB3D – an open-source OpenGL engine which can be compiled and used on all three of BlitzMax's supported platforms. In October 2007, BlitzMax 1.26 was released, which included the addition of a reflection module. BlitzMax 1.32 shipped new threading and Lua scripting modules, and most of the standard library functions were updated so that they are Unicode-friendly. Blitz3D SDK. Blitz3D SDK is a 3D graphics engine based on the engine in Blitz3D. It was marketed for use with C++, C#, BlitzMax, and PureBasic; however, it could also be used with other languages that follow compatible calling conventions. Max3D module. In 2008, the source code to Max3D – a C++-based cross-platform 3D engine – was released under a BSD license. This engine focused on OpenGL but had an abstract backend for other graphics drivers (such as DirectX) and made use of several open-source libraries, namely Assimp, Boost, and ODE. Despite the excitement in the Blitz community over Max3D being the eagerly awaited successor to Blitz3D, interest and support died off soon after the source code was released, and development eventually came to a halt. There is no indication that Blitz Research will pick up the project again. Open-source release. BlitzPlus was released as open-source on 28 April 2014 under the zlib license on GitHub. Blitz3D followed soon after and was released as open source on 3 August 2014. BlitzMax was later released as open source on 21 September 2015. Reception. Blitz Basic 2.1 was well received by Amiga magazines. "CU Amiga" highlighted its ability to create AmigaOS-compliant applications and games (unlike AMOS Basic) and "Amiga Shopper" called it a powerful programming language. Examples. A "Hello, World!" program that prints to the screen, waits until a key is pressed, and then terminates:

Print "Hello, World!" ; Prints to the screen.
WaitKey() ; Pauses execution until a key is pressed.
End ; Ends program.

A program that demonstrates the declaration of variables using the three main data types (strings, integers and floats) and prints them onto the screen:

name$ = "John" ; Create a string variable ($)
age = 36 ; Create an integer variable (no suffix)
temperature# = 27.3 ; Create a float variable (#)
Print "My name is " + name$ + " and I am " + age + " years old."
Print "Today, the temperature is " + temperature# + " degrees."
WaitKey() ; Pauses execution until a key is pressed.
End ; Ends program.

A program that creates a windowed application showing the current time in binary and decimal format; sketches of the BlitzBasic and BlitzMax versions are given after the Legacy section below. Legacy. In 2011, BRL released a cross-platform programming language called Monkey and its first official module called Mojo. Monkey has a similar syntax to BlitzMax, but instead of compiling directly to assembly code, it translates Monkey source files directly into source code for a chosen language, framework or platform, e.g. Windows, Mac OS X, iOS, Android, HTML5, and Adobe Flash. Since 2015, development of Monkey X has been halted in favour of Monkey 2, an updated version of the language by Mark Sibly. The developer of Blitz BASIC, Mark Sibly, died in early December 2024.
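The windowed-clock example referenced in the Examples section might be sketched as follows. These are illustrative reconstructions rather than original listings, assuming the classic built-in commands CurrentTime$() and Bin$() (the latter renders an integer as a 32-digit binary string). First, a BlitzBasic version using a windowed Graphics display:

; BlitzBasic sketch: display the current time in decimal and binary.
Graphics 640, 120, 0, 2 ; open a 640x120 display; mode 2 = windowed
SetBuffer BackBuffer()
While Not KeyHit(1) ; run until Escape is pressed
Cls
time$ = CurrentTime$() ; e.g. "14:30:59"
hours = Mid$(time$, 1, 2) ; Blitz converts the string slices to integers
minutes = Mid$(time$, 4, 2)
seconds = Mid$(time$, 7, 2)
Text 20, 30, "Decimal: " + time$
Text 20, 60, "Binary : " + Right$(Bin$(hours), 6) + ":" + Right$(Bin$(minutes), 6) + ":" + Right$(Bin$(seconds), 6)
Flip
Wend
End

And a BlitzMax equivalent, printed to the console for brevity (Bin$ and Right$ come from the BRL.Retro module that default builds import):

' BlitzMax sketch: the same conversion, using string slicing and casts.
SuperStrict
Local time:String = CurrentTime$() ' e.g. "14:30:59"
Local hours:Int = Int(time[..2]) ' slice off "HH" and cast to Int
Local minutes:Int = Int(time[3..5])
Local seconds:Int = Int(time[6..])
Print "Decimal: " + time
' Right$(..., 6) keeps the low six binary digits, enough for values up to 59.
Print "Binary : " + Right$(Bin$(hours), 6) + ":" + Right$(Bin$(minutes), 6) + ":" + Right$(Bin$(seconds), 6)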
4842
44248157
https://en.wikipedia.org/wiki?curid=4842
Bliss bibliographic classification
The Bliss bibliographic classification (BC) is a library classification system that was created by Henry E. Bliss (1870–1955) and published in four volumes between 1940 and 1953. Although originally devised in the United States, it was more commonly adopted by British libraries. A second edition of the system (BC2) has been in ongoing development in Britain since 1977. Origins. Henry E. Bliss began working on the Bliss Classification system while working at the City College of New York Library as Assistant Librarian. He was a critic of Melvil Dewey's work with the Dewey Decimal System and believed that the organization of titles needed to be approached intellectually; being overly pragmatic or simply alphabetical would be inadequate. In fact, Bliss is the only theorist who created an organizational scheme based on societal needs. Bliss wanted a classification system that would provide distinct rules yet still be adaptable to whatever kind of collection a library might have, as different libraries have different needs. His solution was the concept of "alternative location," in which a particular subject could be put in more than one place, as long as the library made a specific choice and used it consistently. Bliss discussed his theories and the basis of organization for the Bliss Classification for the first time in his 1910 article, "A Modern Classification for Libraries, with Simple Notation, Mnemonics, and Alternatives". This publication followed his 1908 reclassification of the City College collection. His theoretical work, "Organization of Knowledge and the System of the Sciences", appeared in 1929, and the full classification was published in four volumes between 1940 and 1953. The four broad underlying policies of the BC system are: Bliss deliberately avoided the use of the decimal point because of his objection to Dewey's system. Instead, he used capital and lower-case letters, numerals, and every typographical symbol available on his extensive and somewhat eccentric typewriter. Single-letter codes refer to broad subject areas, and further letters are added to refer to increasingly specific subdisciplines, as in the scheme used at Lancaster University. Adoption and change to second edition. In 1967 the Bliss Classification Association was formed. Its first publication was the Abridged Bliss Classification (ABC), intended for school libraries. In 1977 it began to publish and maintain a revised version of Bliss's system, the Bliss Bibliographic Classification (Second Edition) or BC2. This retains only the broad outlines of Bliss's scheme, replacing most of the detailed notation with a new scheme based on the principles of faceted classification. Fifteen of approximately 28 volumes of schedules have so far been published. A revision of this nature has been considered by some to be a completely new system. The City College library in New York continued to use Bliss's system until 1967, when it switched to the Library of Congress system. It had become too expensive to train new staff members to use BC, and too expensive to maintain in general. Much of the Bliss-classified stock remains, however, as no one has re-cataloged the books. The case was different in Britain, where BC proved more popular and also spread to other English-speaking countries. Part of the reason for its success was that libraries in teachers' colleges liked the way Bliss had organized the subject areas on teaching and education. By the mid-1950s the system was being used in at least sixty British libraries, and in about a hundred by the 1970s.
The Bliss Classification system has been found to be successful in academic, specialty, government, and law libraries. It has also found success in libraries outside of the United States, as many of these libraries do not have a history of using either the Dewey Decimal or the Library of Congress classification system. The general organizational pattern for classifying titles in the BC2 method is: Classifications. The Class Schedule in BC2 is:
4845
7903804
https://en.wikipedia.org/wiki?curid=4845
Blood alcohol content
Blood alcohol content (BAC), also called blood alcohol concentration or blood alcohol level, is a measurement of alcohol intoxication used for legal or medical purposes. BAC is expressed as mass of alcohol per volume of blood. In the US and many international publications, BAC levels are written as a percentage, such as 0.08%, i.e. 0.8 grams of alcohol per liter of blood. In different countries, the maximum permitted BAC when driving ranges from the limit of detection (zero tolerance) to 0.08% (0.8 g/L). BAC levels above 0.40% (4 g/L) can be potentially fatal. Units of measurement. BAC is generally defined as a fraction of weight of alcohol per volume of blood, with an SI coherent derived unit of kg/m3 or equivalently grams per liter (g/L). Countries differ in how this quantity is normally expressed. Common formats are listed in the table below. For example, the US and many international publications present BAC as a percentage, such as 0.05%. This would be interpreted as 0.05 grams per deciliter of blood. This same concentration could be expressed as 0.5‰ or 50 mg% in other countries. It is also possible to use other units. For example, in the 1930s Widmark measured alcohol and blood by mass, and thus reported his concentrations in units of g/kg or mg/g, weight of alcohol per weight of blood. Blood is denser than water: 1 mL of blood has a mass of approximately 1.055 grams, so a mass-volume BAC of 1 g/L corresponds to a mass-mass BAC of 0.948 mg/g. Sweden, Denmark, Norway, Finland, Germany, and Switzerland use mass-mass concentrations in their laws, but this distinction is often skipped over in public materials, implicitly assuming that 1 L of blood weighs 1 kg. In pharmacokinetics, it is common to use the amount of substance, in moles, to quantify the dose. As the molar mass of ethanol is 46.07 g/mol, a BAC of 1 g/L is 21.706 mmol/L (21.706 mM). Effects by alcohol level. The magnitude of sensory impairment may vary in people of differing weights. The NIAAA defines the term "binge drinking" as a pattern of drinking that brings a person's blood alcohol concentration (BAC) to 0.08 grams percent or above. Estimation. Direct measurement. Blood samples for BAC analysis are typically obtained by taking a venous blood sample from the arm. A variety of methods exist for determining the blood alcohol concentration in a blood sample. Forensic laboratories typically use headspace gas chromatography combined with mass spectrometry or flame ionization detection, as this method is accurate and efficient. Hospitals typically use enzyme multiplied immunoassay, which measures the co-enzyme NADH. This method is more subject to error but may be performed rapidly in parallel with other blood sample measurements. In Germany, BAC is determined by measuring the serum level and then converting to whole blood by dividing by the factor 1.236. This calculation underestimates BAC by 4% to 10% compared to other methods. By breathalyzer. The amount of alcohol on the breath can be measured, without drawing blood, by blowing into a breathalyzer, giving a breath alcohol content (BrAC). BrAC correlates specifically with the concentration of alcohol in arterial blood; its correlation with the standard BAC found by drawing venous blood is less strong. Jurisdictions vary in the statutory conversion factor from BrAC to BAC, from 2000 to 2400.
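As a worked illustration of such a conversion (the numbers here are hypothetical, not drawn from the article): with a statutory blood-breath conversion factor of 2100, a measured breath concentration of 0.38 mg of alcohol per liter of breath corresponds to

$\mathrm{BAC} \approx 2100 \times 0.38\ \mathrm{mg/L} = 798\ \mathrm{mg/L} \approx 0.8\ \mathrm{g/L} = 0.08\%$

so, since statutory factors range from 2000 to 2400, the same breath reading can fall on either side of a legal limit depending on the factor a jurisdiction has chosen.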
Many factors may affect the accuracy of a breathalyzer test, but breathalyzers are the most common method for measuring alcohol concentration in most jurisdictions. By intake. Blood alcohol content can be quickly estimated by a model developed by Swedish professor Erik Widmark in the 1920s. The model corresponds to a pharmacokinetic single-compartment model with instantaneous absorption and zero-order kinetics for elimination. The model is most accurate when used to estimate BAC a few hours after drinking a single dose of alcohol in a fasted state, and can be within about 20% (coefficient of variation) of the true value. It is not at all realistic for the absorption phase, and is not accurate for BAC levels below 0.2 g/L (alcohol is not eliminated as quickly as predicted) or for consumption with food (it overestimates the peak BAC and the time to return to zero). The equation varies depending on the units and approximations used, but in its simplest form is given by: $EBAC = \frac{A}{r \cdot W} - \beta \cdot t$ where: $A$ is the mass of alcohol consumed (in grams); $r$ is the Widmark factor, the ratio of total body water to body weight (averaging about 0.68 for men and 0.55 for women); $W$ is body weight (in kilograms); $\beta$ is the rate at which alcohol is eliminated (about 0.15 g/L per hour); and $t$ is the time in hours since the start of drinking. With these units, EBAC comes out in g/kg, approximately g/L. A standard drink, defined by the WHO as 10 grams of pure alcohol, is the most frequently used measure in many countries. Examples (illustrative): for an 80 kg man ($r \approx 0.68$) two hours after consuming 30 g of alcohol, $EBAC = \frac{30}{0.68 \times 80} - 0.15 \times 2 \approx 0.25$ g/L; for a 60 kg woman ($r \approx 0.55$) under the same conditions, $EBAC = \frac{30}{0.55 \times 60} - 0.15 \times 2 \approx 0.61$ g/L. In terms of fluid ounces of alcohol consumed ($A$) and body weight in pounds ($W$), Widmark's formula can be simply approximated as $EBAC \approx \frac{5.14\,A}{0.73\,W} - 0.015\,t$ for a man or $EBAC \approx \frac{5.14\,A}{0.66\,W} - 0.015\,t$ for a woman, where EBAC and the factors are given as g/dL (% BAC), such as an elimination factor of 0.015% BAC per hour. By standard drinks. This assumes a US standard drink, i.e. 14 grams (about 0.6 US fluid ounces) of ethanol, whereas other definitions exist, for example 10 grams of ethanol. By training. If individuals are asked to estimate their BAC, then given accurate feedback via a breathalyzer, and this procedure is repeated a number of times during a drinking session, studies show that these individuals can learn to discriminate their BAC, to within a mean error of 9 mg/100 mL (0.009% BAC). The ability is robust to different types of alcohol, different drink quantities, and drinks with unknown levels of alcohol. Trained individuals can even drink alcoholic drinks so as to adjust or maintain their BAC at a desired level. Training the ability does not appear to require any information or procedure besides breathalyzer feedback, although most studies have provided information such as intoxication symptoms at different BAC levels. Subjects continue to retain the ability one month after training. Post-mortem. After fatal accidents, it is common to check the blood alcohol levels of involved persons. However, soon after death, the body begins to putrefy, a biological process which produces ethanol. This can make it difficult to conclusively determine the blood alcohol content in autopsies, particularly in bodies recovered from water. For instance, following the 1975 Moorgate tube crash, the driver's kidneys had a blood alcohol concentration of 80 mg/100 mL, but it could not be established how much of this could be attributed to natural decomposition. Newer research has shown that vitreous (eye) fluid provides an accurate estimate of blood alcohol concentration that is less subject to the effects of decomposition or contamination. Legal limits. For purposes of law enforcement, blood alcohol content is used to define intoxication and provides a rough measure of impairment. Although the degree of impairment may vary among individuals with the same blood alcohol content, it can be measured objectively and is therefore legally useful and difficult to contest in court. Most countries forbid operation of motor vehicles and heavy machinery above prescribed levels of blood alcohol content.
Operation of boats and aircraft is also regulated. Some jurisdictions also regulate bicycling under the influence. The alcohol level at which a person is considered legally impaired to drive varies by country. Test assumptions. Extrapolation. Retrograde extrapolation is the mathematical process by which someone's blood alcohol concentration at the time of driving is estimated by projecting backwards from a later chemical test. This involves estimating the absorption and elimination of alcohol in the interim between driving and testing. The rate of elimination in the average person is commonly estimated at 0.015 to 0.020 grams per deciliter per hour (g/dL/h), although again this can vary from person to person and in a given person from one moment to another. For example, a driver who tests at 0.07 g/dL two hours after driving would, assuming a post-absorptive state and an elimination rate of 0.015 g/dL/h, be estimated to have had a BAC of about 0.07 + 2 × 0.015 = 0.10 g/dL at the time of driving. Metabolism can be affected by numerous factors, including such things as body temperature, the type of alcoholic beverage consumed, and the amount and type of food consumed. In an increasing number of states, laws have been enacted to facilitate this speculative task: the blood alcohol content at the time of driving is legally presumed to be the same as when later tested. There are usually time limits put on this presumption, commonly two or three hours, and the defendant is permitted to offer evidence to rebut this presumption. Forward extrapolation can also be attempted. If the amount of alcohol consumed is known, along with such variables as the weight and sex of the subject and the period and rate of consumption, the blood alcohol level can be estimated by extrapolating forward. Although subject to the same infirmities as retrograde extrapolation (guessing based upon averages and unknown variables), this can be relevant in estimating BAC when driving and/or corroborating or contradicting the results of a later chemical test. Metabolism. The pharmacokinetics of ethanol are well characterized by the ADME acronym (absorption, distribution, metabolism, excretion). Besides the dose ingested, factors such as the person's total body water, speed of drinking, the drink's nutritional content, and the contents of the stomach all influence the profile of blood alcohol content (BAC) over time. Breath alcohol content (BrAC) and BAC have similar profile shapes, so most forensic pharmacokinetic calculations can be done with either. Relatively few studies directly compare BrAC and BAC within subjects and characterize the difference in pharmacokinetic parameters. Comparing arterial and venous BAC, arterial BAC is higher during the absorption phase and lower in the postabsorptive declining phase. Highest levels. According to Guinness World Records, the highest BAC recorded in a person who survived is 1.374% (13.74 g/L), from a 2013 incident.
4848
49214
https://en.wikipedia.org/wiki?curid=4848
Barrister
A barrister is a type of lawyer in common law jurisdictions. Barristers mostly specialise in courtroom advocacy and litigation. Their tasks include arguing cases in courts and tribunals, drafting legal pleadings, researching the law and giving legal opinions. Barristers are distinguished from solicitors and other types of lawyers (e.g. chartered legal executives), who have more direct access to clients and may do transactional legal work. In some legal systems, including those of South Africa, Scandinavia, Pakistan, India, Bangladesh and the Crown Dependencies of Jersey, Guernsey and the Isle of Man, "barrister" is also regarded as an honorific. In a few jurisdictions, barristers are usually forbidden from "conducting" litigation and can only act on the instructions of another lawyer, who performs tasks such as corresponding with parties and the court and drafting court documents. In England and Wales, barristers may seek authorisation from the Bar Standards Board to conduct litigation, allowing a barrister to practise in a dual capacity. In some common law jurisdictions, such as New Zealand and some Australian states and territories, lawyers are entitled to practise both as barristers and solicitors, but it remains a separate system of qualification to practise exclusively as a barrister. In others, such as the United States, the distinction between barristers and other types of lawyers does not exist at all. Differences between barristers and other lawyers. A barrister is a lawyer who represents a litigant as an advocate before a court. A barrister speaks in court and presents the case before a judge, with or without a jury. In some jurisdictions, a barrister receives additional training in evidence law, ethics and court practice and procedure. In contrast, other legal professionals (such as solicitors) generally meet with clients, perform preparatory and administrative work, and provide legal advice. Barristers often have little or no direct contact with their clients. All correspondence, inquiries, invoices and so on will be addressed to the legal adviser, who is also primarily responsible for the barrister's fees. In England and Wales, solicitors and chartered legal executives can support barristers in court, for example by managing large volumes of case documents or negotiating a settlement outside the courtroom. A barrister will usually have rights of audience in the higher courts, whereas other legal professionals will often have more limited access, or will need to acquire additional qualifications to have such access. As in common law countries in which there is a split between the roles of barrister and solicitor, the barrister in civil law jurisdictions is responsible for appearing in trials or pleading cases before the courts. Barristers usually have particular knowledge of case law and precedent, and the skills to build a case. When another legal professional is confronted with an unusual point of law, they may seek the opinion of a barrister on the issue. In most countries, barristers operate as sole practitioners and are prohibited from forming partnerships or from working as a barrister as part of a corporation. In 2004 the Clementi Report recommended the abolition of this restriction in England and Wales. However, barristers normally band together into barristers' chambers to share clerks (administrators) and operating expenses. Some chambers grow to be large and sophisticated.
In some jurisdictions, barristers may be employed by firms and companies as in-house legal advisers. In court, barristers may be visibly distinguished from solicitors, chartered legal executives, and other legal practitioners by their apparel. For example, in criminal courts in Ireland, England and Wales, a barrister usually wears a horsehair wig, stiff collar, bands and a gown. Since January 2008, solicitor advocates have also been entitled to wear wigs, but wear different gowns. In many countries the traditional divisions between barristers and other legal representatives are gradually decreasing. Barristers once enjoyed a monopoly on appearances before the higher courts, but particularly within the United Kingdom this is no longer true. Solicitor-advocates and qualified chartered legal executives can generally appear on behalf of clients at trial. Increasingly, law firms are keeping even the most advanced advisory and litigation work in-house for economic and client relationship reasons. Similarly, the prohibition on barristers taking instructions directly from the public has also been widely abolished. But, in practice, direct instruction is still a rarity in most jurisdictions, partly because barristers with narrow specialisations, or who are only really trained for advocacy, are not prepared to provide general advice to members of the public. Historically, barristers have had a major role in trial preparation, including drafting pleadings and reviewing evidence. In some areas of law, that is still the case. In other areas, it is relatively common for the barrister to receive the brief from the instructing solicitor to represent a client at trial only a day or two before the proceeding. Part of the reason for this is cost. A barrister is entitled to a "brief fee" when a brief is delivered, and this represents the bulk of his or her fee in relation to any trial. They are then usually entitled to a "refresher" for each day of the trial after the first, but if a case is settled before trial, the barrister is not needed and the brief fee would be wasted. Some solicitors avoid this by delaying delivery of the brief until it is certain the case will go to trial. Justification for a split profession. Some benefits of maintaining the split include: Some disadvantages of the split include: Regulation. Barristers are regulated by the Bar for the jurisdiction where they practice, and in some countries, by the Inn of Court to which they belong. In some countries, there is external regulation. Inns of Court, where they exist, regulate admission to the profession. Inns of Court are independent societies that are responsible for the training, admission, and discipline of barristers. Where they exist, a person may only be called to the bar by an Inn, of which they must be a member. Historically, call to and success at the bar, to a large degree, depended upon social connections made early in life. A bar collectively describes all members of the profession of barrister within a given jurisdiction. While as a minimum the bar is an association embracing all its members, it is usually the case, either "de facto" or "de jure", that the bar is invested with regulatory powers over the manner in which barristers practice. Barristers around the world. In the common law tradition, the respective roles of a lawyer, as legal adviser and advocate, were formally split into two separate, regulated sub-professions. 
Historically, the distinction was absolute, but in the modern age, some countries that had a split legal profession now have a fused profession. In practice, the distinction in split jurisdictions may be minor, or marked. In some jurisdictions, such as Australia, Scotland and Ireland, there is little overlap. Australia. In the Australian states of New South Wales, Victoria and Queensland, there is a split profession. Nevertheless, subject to conditions, barristers can accept direct access work from clients. Each state Bar Association regulates the profession and essentially has the functions of the English Inns of Court. In the states of South Australia and Western Australia, as well as the Australian Capital Territory, the professions of barrister and solicitor are fused, but an independent bar nonetheless exists, regulated by the Legal Practice Board of the state or territory. In Tasmania and the Northern Territory, the profession is fused, although a very small number of practitioners operate as an independent bar. Generally, counsel dress in the traditional English manner (wig, gown, bar jacket and jabot) before superior courts, although this is not usually done for interlocutory applications. Wigs and robes are still worn in the Supreme Court and the District Court in civil matters and are dependent on the judicial officer's attire. Robes and wigs are worn in all criminal cases. In Western Australia, wigs are no longer worn in any court. Each year, the Bar Association appoints certain barristers of seniority and eminence to the rank of "Senior Counsel" (in most States and Territories) or "King's Counsel" (in the Northern Territory, Queensland, Victoria and South Australia). Such barristers carry the title "SC" or "KC" after their name. The appointments are made after a process of consultation with members of the profession and the judiciary. Senior Counsel appear in particularly complex or difficult cases. They make up about 14 per cent of the bar in New South Wales. Bangladesh. In Bangladesh, the law relating to barristers is the Bangladesh Legal Practitioners and Bar Council Order as administered and enforced by the Bangladesh Bar Council. The Bar Council is the supreme statutory body that regulates the legal professions in Bangladesh and ensures educational standards and regulatory compliance of advocates. Newly enrolled advocates are permitted to start practice in the district courts after admission. After two years of practice, advocates may apply to practise in the High Court Division of the Supreme Court of Bangladesh by passing the Bar Council Examination. Only advocates who are barristers in the United Kingdom may use the title of barrister. Canada. In Canada (except Quebec), the professions of barrister and solicitor are fused, and many lawyers refer to themselves with both names, even if they do not practise in both areas. In colloquial parlance within the Canadian legal profession, lawyers often term themselves as "litigators" (or "barristers"), or as "solicitors", depending on the nature of their law practice though some may in effect practise as both litigators and solicitors. However, "litigators" would generally perform all litigation functions traditionally performed by barristers and solicitors; in contrast, those terming themselves "solicitors" would generally limit themselves to legal work not involving practice before the courts (not even in a preparatory manner as performed by solicitors in England), though some might practise before chambers judges. 
As is the practice in many other Commonwealth jurisdictions such as Australia, Canadian litigators are gowned, but without a wig, when appearing before courts of superior jurisdiction. All law graduates from Canadian law schools, and certified internationally qualified lawyers, can apply to the relevant provincial law society for admission. A year of articling as a student supervised by a qualified lawyer and the passing of provincial bar exams are also required for an individual to be called to the bar as a barrister and solicitor. The situation is somewhat different in Quebec as a result of its civil law tradition. The profession of solicitor, or "avoué", never took hold in colonial Quebec, so lawyers ("avocats") have traditionally been a fused profession, arguing and preparing cases in contentious matters, whereas Quebec's other type of lawyer, civil-law notaries ("notaires"), handle out-of-court non-contentious matters. However, a number of areas of non-contentious private law are not monopolised by notaries, so that lawyers / avocats often specialise in handling either trials, cases, advising, or non-trial matters. The only disadvantage is that lawyers / avocats cannot draw up public instruments that have the same force of law as notarial acts. Most large law firms in Quebec offer the full range of legal services of law firms in common-law provinces. Individuals who seek to become lawyers / avocats must earn a bachelor's degree in civil law, pass the provincial bar examination, and successfully complete legal articles to be admitted to practise. Lawyers are regulated by the Barreau du Québec, constituted under Quebec law. France. In France, "avocats", or attorneys, were, until the 20th century, the equivalent of barristers. The profession included several grades ranked by seniority: "avocat-stagiaire" (trainee, who was already qualified but needed to complete two years (or more, depending on the period) of training alongside seasoned lawyers), "avocat", and "avocat honoraire" (emeritus barrister). Since the 14th century, and during the course of the 19th and 20th in particular, French barristers competed in territorial battles over respective areas of legal practice against the "conseil juridique" (legal advisor, transactional solicitor) and "avoué" (procedural solicitor), and expanded to become the generalist legal practitioner, with the notable exception of "notaires" (notaries), who are ministry-appointed lawyers (with a separate qualification) and who retain exclusivity over conveyancing and probate. After the 1971 and 1990 legal reforms, the "avocat" was fused with the "avoué" and the "conseil juridique", making the "avocat" (or, if female, "avocate") an all-purpose lawyer for matters of contentious jurisdiction, analogous to an American attorney. French attorneys usually do not (although they are entitled to) act both as litigators (trial lawyers) and legal consultants (advising lawyers), known respectively as "avocat plaidant" and "avocat-conseil". This distinction is, however, purely informal and does not correspond to any difference in qualification or admission to the role. All intending attorneys must pass an examination to be able to enrol in one of the "Centre régional de formation à la profession d'avocat (CRFPA)" (Regional centre for the training of lawyers). The "CRFPA" course has a duration of two years and is a mix of classroom teaching and internships.
Its culmination is the "stage final" (final training), where the intending attorney spends six months in a law firm (generally in their favoured field of practice and in a firm in which they hope to be recruited afterwards). The intending attorney then needs to pass the "Certificat d'Aptitude à la Profession d'Avocat (CAPA)", which is the last professional examination allowing them to join a court's bar ("barreau"). It is generally recognised that the first examination is much more difficult than the CAPA and is dreaded by most law students. Each bar is regulated by a Bar Council ("Ordre du barreau"). A separate body of barristers exists, called the "avocats au Conseil d'Etat et à la Cour de Cassation". Although their legal background, training and status are the same as those of the all-purpose avocats, they have a monopoly over litigation taken to the supreme courts, in civil, criminal or administrative matters. Germany. In Germany, no distinction between barristers and solicitors is made. Lawyers may plead at all courts except the civil branch of the Federal Court of Justice ("Bundesgerichtshof"), to which fewer than fifty lawyers are admitted. Those lawyers, who deal almost exclusively with litigation, may not plead at other courts and are usually instructed by a lawyer who represented the client in the lower courts. However, these restrictions do not apply to criminal cases, nor to pleadings at courts of the other court systems, including labour, administrative, taxation, and social courts and the European Union court system. Hong Kong. The legal profession in Hong Kong is also divided into two branches: barristers and solicitors. In the High Court of Hong Kong (including both the Court of First Instance and the Court of Appeal) and the Hong Kong Court of Final Appeal, as a general rule, only barristers and solicitor-advocates are allowed to speak on behalf of any party in open court; solicitors are restricted from doing so. In these two courts, save for hearings in chambers, barristers dress in the traditional English manner, as do the judges and other lawyers. In Hong Kong, the rank of King's Counsel was granted prior to the handover of Hong Kong from the United Kingdom to China in 1997. After the handover, the rank was replaced by Senior Counsel, with the post-nominal letters SC. Senior Counsel may still, however, style themselves as silks, like their British counterparts. India. In India, the law relating to barristers is the Advocates Act, 1961, which is administered and enforced by the Bar Council of India. Under the act, the council is the supreme regulatory body for the legal profession in India, ensuring the compliance of the laws and maintenance of professional standards by the legal profession in the country. The council is authorised to pass regulations and make orders in individual cases. Each state has a bar council whose function is to enrol barristers practising predominantly within that state. Each barrister must be enrolled with a single state bar council to practise in India. However, this does not restrict a barrister from appearing before any court in India. For all practical and legal purposes, the Bar Council of India retains the final power to take decisions in any and all matters related to the legal profession, whether as a whole or with respect to any individual advocate. There are two requirements to practise in India.
First, the applicant must be a holder of a law degree from a recognised institution in India (or from one of the four recognised universities in the United Kingdom). Second, they must pass the enrolment qualifications of the bar council of the state in which they seek to be enrolled. Through regulation, the Bar Council of India also ensures that the standard of education required for practising in India is met. A barrister is required to maintain certain standards of conduct and professional demeanour at all times. The Bar Council of India prescribes rules of conduct to be observed by barristers in the courts, while interacting with clients, and in non-professional settings. Ireland. In the Republic of Ireland, admission to the Bar by the Chief Justice of Ireland is restricted to those on whom a Barrister-at-Law degree (BL) has first been conferred. The Honorable Society of King's Inns is the only educational establishment which runs vocational courses for barristers in the Republic, and degrees of Barrister-at-Law can only be conferred by King's Inns. King's Inns is also the only body with the capacity to call individuals to the bar and to disbar them. Most Irish barristers choose to be governed thereafter by the Bar of Ireland, a quasi-private entity. Senior members of the profession may be selected for elevation to the Inner Bar, when they may describe themselves as Senior Counsel ("SC"). All barristers who have not been called to the Inner Bar are known as Junior Counsel (and are identified by the postnominal initials "BL"), regardless of age or experience. Admission to the Inner Bar is made by declaration before the Supreme Court, patents of precedence having been granted by the Government. Irish barristers are sole practitioners and may not form chambers or partnerships if they wish to remain members of the Bar of Ireland's Law Library. To practise under the Bar of Ireland's rules, a newly qualified barrister is apprenticed to an experienced barrister of at least seven years' experience. This apprenticeship is known as pupillage or devilling. Devilling is compulsory for those barristers who wish to be members of the Law Library and lasts for one legal year. It is common to devil for a second year in a less formal arrangement, but this is not compulsory. Devils are not generally paid for their work in their devilling year. Israel. In Israel, those certified to practise law hold the title of Advocate. The certification to practise law is unified, as there is no distinction between barristers and solicitors. Japan. Japan adopts a unified system. However, there are certain classes of qualified professionals who are allowed to practise in certain limited areas of law, such as scriveners ("shiho shoshi", qualified to handle title registration, deposits, and certain petty court proceedings with additional certification), tax accountants ("zeirishi", qualified to prepare tax returns, provide advice on tax computation and represent a client in administrative tax appeals) and patent agents ("benrishi", qualified to practise patent registration and represent a client in administrative patent appeals). Only lawyers ("bengoshi") can appear before the court and are qualified to practise in any area of law, including, but not limited to, the areas that the qualified law-related professionals above are allowed to practise.
Most attorneys still focus primarily on court practice, and only a very small number of attorneys give sophisticated and expert legal advice on a day-to-day basis to large corporations. Netherlands. The Netherlands used to have a semi-separated legal profession comprising the lawyer and the "procureur", the latter resembling, to some extent, the profession of barrister. Under that system, lawyers were entitled to represent their clients in law but were only able to file cases before the court at which they were registered. Cases falling under the jurisdiction of another court had to be filed by a "procureur" registered at that court, in practice often another lawyer exercising both functions. Questions were raised about the necessity of the separation, given the fact that its main purpose – the preservation of the quality of the legal profession and observance of local court rules and customs – had become obsolete. For that reason, the "procureur" as a separate profession was abolished and its functions merged with the legal profession in 2008. Currently, lawyers can file cases before any court, regardless of where they are registered. The only notable exception concerns civil cases brought before the Supreme Court, which have to be handled by lawyers registered at the Supreme Court, who thereby gain the title "lawyer at the Supreme Court". New Zealand. In New Zealand, the professions are not formally fused, but practitioners are enrolled in the High Court as "Barristers and Solicitors". They may choose, however, to practise as barristers sole. About 15% practise solely as barristers, mainly in the larger cities and usually in "chambers" (following the British terminology). They receive "instructions" from other practitioners, at least nominally. They usually conduct the proceedings in their entirety. Any lawyer may apply to become a King's Counsel (KC) in recognition of a long-standing contribution to the legal profession, but this status is conferred on those practising as solicitors only in exceptional circumstances. This step, referred to as "being called to the inner bar" or "taking silk", is considered highly prestigious and has been a step in the career of many New Zealand judges. Unlike other jurisdictions, the term "junior barrister" is popularly used to refer to a lawyer who holds a practising certificate as a barrister but is employed by another, more senior barrister. Generally, junior barristers are within their first five years of practice and are not yet qualified to practise as barristers sole. Barristers sole (i.e. barristers who are not employed by another barrister) who are not King's Counsel are never referred to as junior barristers. Nigeria. In Nigeria, there is no formal distinction between barristers and solicitors. All students who pass the bar examinations – offered exclusively by the Nigerian Law School – are called to the Nigerian bar by the Body of Benchers. Lawyers may argue in any federal trial or appellate court, as well as any of the courts in Nigeria's 36 states and the Federal Capital Territory. The Legal Practitioners Act refers to Nigerian lawyers as legal practitioners, and, following their call to the Bar, Nigerian lawyers enter their names in the register or Roll of Legal Practitioners kept at the Supreme Court. For this reason, a Nigerian lawyer is often referred to as a Barrister and Solicitor of the Supreme Court of Nigeria, and many Nigerian lawyers term themselves Barrister-at-Law with the postnominal initials "B.L".
The vast majority of Nigerian lawyers combine contentious and non-contentious work, although there is a growing tendency for practitioners in the bigger practices to specialise in one or the other. In colloquial parlance within the Nigerian legal profession, lawyers may therefore be referred to as "litigators" or as "solicitors". Consistent with the practice in England and elsewhere in the Commonwealth, senior members of the profession may be selected for elevation to the Inner Bar by the conferment of the rank of Senior Advocate of Nigeria (SAN). Pakistan. The profession in Pakistan is fused; an advocate works both as a barrister and a solicitor, with higher rights of audience being provided. To practise as a barrister in Pakistan, a law graduate must complete three steps: pass the Bar Practical and Training Course (BPTC), be called to the Provincial Bar Council, and obtain a licence to practise as an advocate in the courts of Pakistan from the relevant Bar Council, at either the provincial or federal level. Poland. In Poland, there are two main types of legal professions: advocate and legal counsel. Both are regulated, and these professions are restricted to people who have graduated from five-year law studies, have at least three years of experience, and have passed five difficult national exams (civil law, criminal law, company law, administrative law and ethics), or who hold a doctor of law degree. Before 2015, the only difference was that advocates had the right to represent clients before the court in all cases, whereas legal counsels could not represent clients before the court in criminal cases. Presently, legal counsels can also represent clients in criminal cases, making the historical differences between these professions largely obsolete. South Africa. In South Africa, the employment and practice of advocates (as barristers are known in South Africa) is consistent with the rest of the Commonwealth. Advocates carry the rank of Junior or Senior Counsel (SC) and are mostly briefed and paid by solicitors (known as attorneys). They are usually employed in the higher courts, particularly in the appeal courts, where they often appear as specialist counsel. South African solicitors (attorneys) follow a practice of referring cases to Counsel for an opinion before proceeding with a case, when the Counsel in question practises as a specialist in the case law at stake. Aspiring advocates currently spend one year in pupillage (formerly only six months) before being admitted to the bar in their respective provincial or judicial jurisdictions. The term "Advocate" is sometimes used in South Africa as a title, e.g. "Advocate John Doe, SC" ("Advokaat" in Afrikaans), in the same fashion as "Dr. John Doe" for a medical doctor. South Korea. In South Korea, there was historically no distinction between the judiciary and practising lawyers: a person who passed the national bar exam after two years of national education could become a judge, prosecutor, or "lawyer" based on their grades upon graduation. Due to changes brought by the implementation of an accommodated law school system, there are now two standard pathways to becoming a lawyer. Under the current legal system, lawyers who wish to become judges or prosecutors must first gain practical experience in legal practice. There is no specific limitation on the areas of law a "lawyer" may practise. Spain. Spain has a division, but it does not correspond to the division in Britain between barristers/advocates and solicitors.
"Procuradores" represent the litigant procedurally in court, generally under the authority of a power of attorney executed by a civil law notary, while "abogados" represent the substantive claims of the litigant through trial advocacy. "Abogados" perform both transactional work and advise in connection with court proceedings, and they have full right of audience in front of the court. The court proceeding is carried out with "abogados", not with procuradores. In essence, procuradores act as court agents operating under the instructions of an abogado. Their practice is confined to the locality of the court to which they are admitted. United Kingdom. Under European Union law, barristers, along with advocates, chartered legal executives and solicitors, are recognised as lawyers. England and Wales. Although with somewhat different laws, England and Wales are considered within the United Kingdom a single united and unified legal jurisdiction for the purposes of both civil and criminal law, alongside Scotland and Northern Ireland, the other two legal jurisdictions within the United Kingdom. England and Wales are covered by a common bar and a single law society. The profession of barrister in England and Wales is a separate profession from that of solicitor. It is, however, possible to hold the qualification of both barrister and solicitor, and/or chartered legal executive at the same time. It is not necessary to leave the bar to qualify as a solicitor. Barristers are regulated by the Bar Standards Board, a division of the General Council of the Bar. A barrister must be a member of one of the Inns of Court, which traditionally educated and regulated barristers. There are four Inns of Court: The Honourable Society of Lincoln's Inn, The Honourable Society of Gray's Inn, The Honourable Society of the Middle Temple and The Honourable Society of the Inner Temple. All are located in Central London, near the Royal Courts of Justice. They perform both scholastic and social roles, and in all cases, provide financial aid to student barristers (subject to merit) through scholarships. It is the Inns that call students to the Bar at a ceremony akin to a graduation. Social functions include dining with other members and guests and hosting other events. Law graduates wishing to work and be known as barristers must take a course of professional training (known as the "vocational component") at one of the institutions approved by the Bar Council. Until late 2020 this course was exclusively the Bar Professional Training Course, but since then the approved training offer was broadened to would-be barristers via a number of different courses, such as the new Bar Vocational Course at the Inns of Court College of Advocacy. On successful completion of the vocational component, student barristers are called to the bar by their respective inns and are elevated to the degree of barrister. However, before they can practise independently they must first undertake 12 months of pupillage. The first six months (the "non-practising period") are spent shadowing more senior practitioners, after which pupil barristers may begin to undertake some court work of their own (the "practising period"). Following successful completion of this stage, most barristers then join a set of Chambers, a group of counsel who share the costs of premises and support staff whilst remaining individually self-employed. 
There are approximately 17,000 barristers, of whom about ten per cent are King's Counsel and about 4,000 are "young barristers" (i.e., under seven years' Call). In April 2023 there were 13,800 barristers in self-employed practice. In 2022 there were about 3,100 employed barristers working in companies as in-house counsel, or in local or national government, academic institutions, or the armed forces. Certain barristers in England and Wales are now instructed directly by members of the public. Members of the public may engage the services of a barrister directly within the framework of the Public Access Scheme; a solicitor is not involved at any stage. Barristers undertaking public access work can provide legal advice and representation in court in almost all areas of law and are entitled to represent clients in any court or tribunal in England and Wales. Once instructions from a client are accepted, it is the barrister who advises and guides the client through the relevant legal procedure or litigation. Before a barrister can undertake public access work, they must have completed a special course. At present, about one in 20 barristers has so qualified. There is also a separate scheme called "Licensed Access", available to certain nominated classes of professional client; it is not open to the general public. Public access work is experiencing a huge surge at the bar, with barristers taking advantage of the new opportunity to make a profit in the face of legal aid cuts elsewhere in the profession. The ability of barristers to accept such instructions is a recent development; it results from a change in the rules set down by the General Council of the Bar in July 2004. The Public Access Scheme has been introduced as part of the drive to open up the legal system to the public and to make it easier and cheaper to obtain access to legal advice. It further reduces the distinction between solicitors and barristers. The distinction remains, however, because there are certain aspects of a solicitor's role that a barrister is not able to undertake. Historically, a barrister might use the honorific "Esquire". Even though the term "barrister-at-law" is sometimes seen, and was once very common, it has never been formally correct in England and Wales; "barrister" is the only correct nomenclature. Barristers are expected to maintain very high standards of professional conduct. The objective of the barristers' code of conduct is to avoid dominance by either the barrister or the client, with the client enabled to make informed decisions in a supportive atmosphere; in turn, the client expects (implicitly and/or explicitly) the barrister to uphold their duties, namely by acting in the client's best interests (CD2), acting with honesty and integrity (CD3), keeping the client's affairs confidential (CD6) and working to a competent standard (CD7). These core duties (CDs) are a few, among others, that are enshrined in the BSB Handbook. Northern Ireland. In April 2003 there were 554 barristers in independent practice in Northern Ireland. 66 were King's Counsel (KCs), barristers who have earned a high reputation and are appointed by the King on the recommendation of the Lord Chancellor as senior advocates and advisers. Those barristers who are not KCs are called Junior Counsel and are styled "BL" or "Barrister-at-Law". The term "junior" is often misleading, since many members of the Junior Bar are experienced barristers with considerable expertise.
Benchers are, and have been for centuries, the governing bodies of the four Inns of Court in London and King's Inns, Dublin. The Benchers of the Inn of Court of Northern Ireland governed the Inn until the enactment of the Constitution of the Inn in 1983, which provides that the government of the Inn is shared between the Benchers, the Executive Council of the Inn, and members of the Inn assembled in General Meeting. The Executive Council (through its Education Committee) is responsible for considering Memorials submitted by applicants for admission as students of the Inn, and by Bar students of the Inn for admission to the degree of Barrister-at-Law, and for making recommendations to the Benchers. The final decisions on these Memorials are taken by the Benchers. The Benchers also have the exclusive power of expelling or suspending a Bar student and of disbarring a barrister or suspending a barrister from practice. The Executive Council is also involved with: education; fees of students; calling counsel to the Bar, although call to the Bar is performed by the Lord Chief Justice of Northern Ireland on the invitation of the Benchers; administration of the Bar Library (to which all practising members of the Bar belong); and liaising with corresponding bodies in other countries. Scotland. In Scotland an advocate is, in all respects except name, a barrister, but there are significant differences in professional practice. Admission to and the conduct of the profession are regulated by the Faculty of Advocates (as opposed to an Inn). Crown Dependencies and UK Overseas Territories. Isle of Man, Jersey and Guernsey. In the Bailiwick of Jersey, there are solicitors (called "écrivains") and advocates (French "avocat"). In the Bailiwicks of Jersey and Guernsey and on the Isle of Man, advocates perform the combined functions of both solicitors and barristers. Gibraltar. Gibraltar is a British Overseas Territory with a legal profession based on the common law. The legal profession includes both barristers and solicitors, with most barristers also acting as solicitors. Admission and disciplinary matters in Gibraltar are dealt with by the Bar Council of Gibraltar and the Supreme Court of Gibraltar. For barristers or solicitors to be admitted as practising lawyers in Gibraltar, they must comply with the Supreme Court Act 1930, as amended by the Supreme Court Amendment Act 2015, which requires, amongst other things, all newly admitted lawyers as of 1 July 2015 to undertake a year's course in Gibraltar law at the University of Gibraltar. Solicitors also have right of audience in Gibraltar's courts. United States. The United States does not distinguish between lawyers as barristers and solicitors. Any American lawyer who has passed a bar examination and has been admitted to practise law in a particular US jurisdiction may prosecute or defend. The barrister–solicitor distinction existed historically in some US states, which had a separate label for barristers (called "counselors", hence the expression "attorney and counselor at law"). Both professions have long since been fused. Additionally, some state appellate courts require attorneys to obtain a separate certificate of admission to plead and practise in the appellate court. Federal courts require specific admission to that court's bar to practise before it.
At the state appellate level and in federal courts, there is generally no separate examination process, although some US district courts require an examination on practices and procedures in their specific courts. Unless an examination is required, admission is usually granted as a matter of course to any licensed attorney in the state where the court is located. Some federal courts will grant admission to any attorney licensed in any US jurisdiction.
4849
49695963
https://en.wikipedia.org/wiki?curid=4849
Battle of Gettysburg
The Battle of Gettysburg was a three-day battle in the American Civil War, fought between the Union and Confederate armies between July 1 and July 3, 1863, in and around Gettysburg, Pennsylvania. The battle, won by the Union, is widely considered the Civil War's turning point, leading to the Union's ultimate victory and the preservation of the nation. The Battle of Gettysburg was the bloodiest battle of the Civil War, and indeed of all American military history, claiming approximately 50,000 combined casualties. Union Major General George Meade's Army of the Potomac defeated attacks by Confederate General Robert E. Lee's Army of Northern Virginia, halting Lee's invasion of the North and forcing his retreat. After his success in the Battle of Chancellorsville in Spotsylvania County, Virginia, in May 1863, Lee led his Confederate forces through the Shenandoah Valley to begin the Gettysburg Campaign, his second attempt to invade the North. With his army in high spirits, Lee intended to shift the focus of the summer campaign from war-ravaged northern Virginia, hoping to penetrate as far as Harrisburg or Philadelphia and thereby convince northern politicians to end the war. President Abraham Lincoln initially prodded Major General Joseph Hooker to lead his troops in pursuing Lee, but relieved Hooker of his command just three days before the Battle of Gettysburg commenced, replacing him with Meade. On July 1, 1863, as Lee's forces moved on Gettysburg in the hopes of destroying the Union army, the two armies collided, and the battle commenced. Low ridges to the northwest of Gettysburg were initially defended by a Union cavalry division under Brigadier General John Buford, soon reinforced by two corps of Union infantry. Two large Confederate corps assaulted them from the northwest and north, however, collapsing the hastily developed Union lines and forcing the defenders to retreat through the streets of Gettysburg to the hills just south of the town. On the second day of battle, July 2, the Union line was laid out in a defensive formation resembling a fishhook. In the late afternoon, Lee launched a heavy assault on the Union's left flank, leading to fierce fighting at Little Round Top, the Wheatfield, Devil's Den, and the Peach Orchard. On the Union's right flank, Confederate demonstrations escalated into full-scale assaults on Culp's Hill and Cemetery Hill. Despite incurring significant losses, Union forces held their lines. On the third day of battle, July 3, fighting resumed on Culp's Hill, and cavalry battles raged to the east and south of Gettysburg. The main engagement was Pickett's Charge, a dramatic infantry assault by approximately 12,000 Confederate troops against the center of the Union line at Cemetery Ridge, which was repelled by Union rifle and artillery fire at great loss to the Confederates. The following day, on the Fourth of July, Lee led his Confederate troops on the torturous retreat from the North. Between 46,000 and 51,000 soldiers from both armies were casualties in the three-day Battle of Gettysburg, the deadliest battle of both the Civil War and all of U.S. history.
On November 19, Lincoln traveled to Gettysburg, where he spoke at a ceremony dedicating the Gettysburg National Cemetery. His famed Gettysburg Address, a 271-word speech that has endured as one of the most famous in American history, honored the fallen Union soldiers and redefined the purpose of the Civil War. Background. Military situation. Shortly after the Army of Northern Virginia won a major victory over the Army of the Potomac at the Battle of Chancellorsville (April 30 – May 6, 1863), General Robert E. Lee decided upon a second invasion of the North (the first was the unsuccessful Maryland campaign of September 1862, which ended in the bloody Battle of Antietam). Such a move would upset the Union's plans for the summer campaigning season and possibly reduce the pressure on the besieged Confederate garrison at Vicksburg. The invasion would allow the Confederates to live off the bounty of the rich Northern farms while giving war-ravaged Virginia a much-needed rest. In addition, Lee's 72,000-man army could threaten Philadelphia, Baltimore, and Washington, and possibly strengthen the growing peace movement in the North. Initial movements to battle. Thus, on June 3, Lee's army began to shift northward from Fredericksburg, Virginia. Following the death of Thomas J. "Stonewall" Jackson, Lee reorganized his two large corps into three new corps, commanded by Lieutenant General James Longstreet (First Corps), Lieutenant General Richard S. Ewell (Second), and Lieutenant General A.P. Hill (Third); both Ewell and Hill, who had formerly reported to Jackson as division commanders, were new to this level of responsibility. The cavalry division remained under the command of Major General J.E.B. Stuart. The Union's Army of the Potomac, commanded by Major General Joseph Hooker, consisted of seven infantry corps, a cavalry corps, and an artillery reserve, for a combined strength of more than 100,000 men. The first major action of the campaign took place on June 9 between cavalry forces at Brandy Station, near Culpeper, Virginia. The 9,500 Confederate cavalrymen under Stuart were surprised by Major General Alfred Pleasonton's combined arms force of two cavalry divisions (8,000 troopers) and 3,000 infantry, but Stuart eventually repelled the Union attack. The inconclusive battle, the largest predominantly cavalry engagement of the war, proved for the first time that Union cavalrymen were equal to their Southern counterparts. By mid-June, the Army of Northern Virginia was poised to cross the Potomac River and enter Maryland. After defeating the Union garrisons at Winchester and Martinsburg, Ewell's Second Corps began crossing the river on June 15. Hill's and Longstreet's corps followed on June 24 and 25. Hooker's army pursued, keeping between Washington, D.C., and Lee's army. The Union army crossed the Potomac from June 25 to 27. Lee gave strict orders for his army to minimize any negative effects on the civilian population. Food, horses, and other supplies were generally not seized outright unless a citizen concealed property, although quartermasters who reimbursed Northern farmers and merchants with virtually worthless Confederate money, or with equally worthless promissory notes, were not well received. Various towns, most notably York, Pennsylvania, were required to pay indemnities in lieu of supplies, under threat of destruction. During the invasion, the Confederates seized between 40 and nearly 60 northern African Americans.
A few of them were escaped slaves, but many were freemen; all were sent south into slavery under guard. On June 26, elements of Major General Jubal Early's division of Ewell's corps occupied the town of Gettysburg after chasing off the newly raised 26th Pennsylvania Emergency Militia Infantry Regiment in a series of minor skirmishes. Early laid the borough under tribute but did not collect any significant supplies. Soldiers burned several railroad cars and a covered bridge, and destroyed nearby rails and telegraph lines. The following morning, Early departed for adjacent York County. Meanwhile, in a controversial move, Lee allowed Stuart to take a portion of the army's cavalry and ride around the east flank of the Union army. Lee's orders gave Stuart much latitude, and both generals share the blame for the long absence of Stuart's cavalry, as well as for the failure to assign a more active role to the cavalry left with the army. Stuart and his three best brigades were absent from the army during the crucial phase of the approach to Gettysburg and the first two days of battle. By June 29, Lee's army was strung out in an arc from Chambersburg, northwest of Gettysburg, to Carlisle, north of Gettysburg, to near Harrisburg and Wrightsville on the Susquehanna River. In a dispute over the use of the forces defending the Harpers Ferry garrison, Hooker offered his resignation, and Abraham Lincoln and General-in-Chief Henry W. Halleck, who were looking for an excuse to rid themselves of him, immediately accepted. They replaced Hooker early on the morning of June 28 with Major General George Gordon Meade, then commander of the V Corps. On June 29, when Lee learned that the Army of the Potomac had crossed the Potomac River, he ordered a concentration of his forces around Cashtown, located at the eastern base of South Mountain and west of Gettysburg. On June 30, while part of Hill's corps was in Cashtown, one of Hill's brigades (North Carolinians under Brigadier General J. Johnston Pettigrew) ventured toward Gettysburg. In his memoirs, Major General Henry Heth, Pettigrew's division commander, claimed that he sent Pettigrew to search for supplies in town—especially shoes. When Pettigrew's troops approached Gettysburg on June 30, they noticed Union cavalry under Brigadier General John Buford arriving south of town, and Pettigrew returned to Cashtown without engaging them. When Pettigrew told Hill and Heth what he had seen, neither general believed that there was a substantial Union force in or near the town, suspecting that it had been only Pennsylvania militia. Despite Lee's order to avoid a general engagement until his entire army was concentrated, Hill decided to mount a significant reconnaissance in force the following morning to determine the size and strength of the enemy force in his front. Around 5 a.m. on Wednesday, July 1, two brigades of Heth's division advanced to Gettysburg. Opposing forces. Union. The Army of the Potomac, initially under Hooker (Meade replaced Hooker in command on June 28), consisted of more than 100,000 men. During the advance on Gettysburg, Major General John F. Reynolds was in operational command of the left, or advanced, wing of the army, consisting of the I, III, and XI corps. Many other Union units (not part of the Army of the Potomac) were actively involved in the Gettysburg Campaign, but not directly involved in the Battle of Gettysburg.
These included portions of the Union IV Corps, the militia and state troops of the Department of the Susquehanna, and various garrisons, including that at Harpers Ferry. Confederate. Following the death of Jackson after Chancellorsville, Lee reorganized his Army of Northern Virginia (75,000 men) from two infantry corps into three. First day of battle. Herr Ridge, McPherson Ridge and Seminary Ridge. Anticipating that the Confederates would march on Gettysburg from the west on the morning of July 1, Buford laid out his defenses on three ridges west of the town: Herr Ridge, McPherson Ridge and Seminary Ridge. These ridges offered appropriate terrain for a delaying action by his small cavalry division against superior Confederate infantry forces, meant to buy time while awaiting the arrival of Union infantrymen who could occupy the strong defensive positions south of town at Cemetery Hill, Cemetery Ridge, and Culp's Hill. Buford understood that if the Confederates could gain control of these heights, Meade's army would have difficulty dislodging them. Heth's division advanced with two brigades forward, commanded by brigadier generals James J. Archer and Joseph R. Davis. They proceeded easterly in columns along the Chambersburg Pike. West of town, about 7:30 a.m. on July 1, the two brigades met light resistance from vedettes of Union cavalry and deployed into line. According to lore, the Union soldier who fired the first shot of the battle was Lieutenant Marcellus Jones. Eventually Heth's men encountered dismounted troopers of Colonel William Gamble's cavalry brigade. The dismounted troopers resisted stoutly, delaying the Confederate advance, most firing their breech-loading Sharps carbines from behind fences and trees. (A small number of troopers had other carbine models. A small minority of historians have written that some troopers had Spencer repeating carbines or Spencer repeating rifles, but most sources disagree.) Still, by 10:20 a.m., the Confederates had pushed the Union cavalrymen east to McPherson Ridge, when the vanguard of the I Corps (Major General John F. Reynolds) finally arrived. North of the pike, Davis gained a temporary success against Brigadier General Lysander Cutler's brigade but was repelled with heavy losses in an action around an unfinished railroad bed cut in the ridge. South of the pike, Archer's brigade assaulted through Herbst (also known as McPherson's) Woods. The Union Iron Brigade under Brigadier General Solomon Meredith enjoyed initial success against Archer, capturing several hundred men, including Archer himself. General Reynolds was shot and killed early in the fighting while directing troop and artillery placements just to the east of the woods. Shelby Foote wrote that the Union cause lost a man considered by many to be "the best general in the army". Major General Abner Doubleday assumed command. Fighting in the Chambersburg Pike area lasted until about 12:30 p.m. It resumed around 2:30 p.m., when Heth's entire division engaged, adding the brigades of Pettigrew and Colonel John M. Brockenbrough. As Pettigrew's North Carolina brigade came on line, it flanked the 19th Indiana and drove the Iron Brigade back. The 26th North Carolina (the largest regiment in the army, with 839 men) lost heavily, leaving the first day's fight with around 212 men. By the end of the three-day battle, it had about 152 men standing, the highest casualty percentage for one battle of any regiment, North or South. Slowly the Iron Brigade was pushed out of the woods toward Seminary Ridge.
Hill added Major General William Dorsey Pender's division to the assault, and the I Corps was driven back through the grounds of the Lutheran Seminary and the streets of Gettysburg. As the fighting to the west proceeded, two divisions of Ewell's Second Corps, marching west toward Cashtown in accordance with Lee's order for the army to concentrate in that vicinity, turned south on the Carlisle and Harrisburg roads toward Gettysburg, while the Union XI Corps (Major General Oliver O. Howard) raced north on the Baltimore Pike and Taneytown Road. By early afternoon, the Union line ran in a semicircle west, north, and northeast of Gettysburg. However, the Union did not have enough troops; Cutler, whose brigade was deployed north of the Chambersburg Pike, had his right flank in the air. The leftmost division of the XI Corps was unable to deploy in time to strengthen the line, so Doubleday was forced to throw in reserve brigades to salvage his line. Around 2:00 p.m., the Confederate Second Corps divisions of major generals Robert E. Rodes and Jubal Early assaulted and out-flanked the Union I and XI corps' positions north and northwest of town. The Confederate brigades of Colonel Edward A. O'Neal and Brigadier General Alfred Iverson suffered severe losses assaulting the I Corps division of Brigadier General John C. Robinson south of Oak Hill. Early's division profited from a blunder by Brigadier General Francis C. Barlow, who advanced his XI Corps division to Blocher's Knoll (directly north of town and now known as Barlow's Knoll); this created a salient in the corps line, susceptible to attack from multiple sides, and Early's troops overran Barlow's division, which constituted the right flank of the Union Army's position. Barlow was wounded and captured in the attack. As Union positions collapsed both north and west of town, Howard ordered a retreat to the high ground south of town at Cemetery Hill, where he had left the division of Brigadier General Adolph von Steinwehr in reserve. Major General Winfield S. Hancock assumed command of the battlefield, sent by Meade when he heard that Reynolds had been killed. Hancock, commander of the II Corps and Meade's most trusted subordinate, was ordered to take command of the field and to determine whether Gettysburg was an appropriate place for a major battle. Hancock told Howard, "I think this the strongest position by nature upon which to fight a battle that I ever saw." When Howard agreed, Hancock concluded the discussion: "Very well, sir, I select this as the battle-field." Hancock's determination had a morale-boosting effect on the retreating Union soldiers, but he played no direct tactical role on the first day. General Lee understood the defensive advantage the Union would gain by holding the high ground. He sent orders to Ewell that Cemetery Hill be taken "if practicable". Ewell, who had previously served under Stonewall Jackson, a general well known for issuing peremptory orders, determined that such an assault was not practicable and thus did not attempt it; this decision is considered by historians to be a great missed opportunity. The first day at Gettysburg, more significant than simply a prelude to the bloody second and third days, ranks as the 23rd biggest battle of the war by number of troops engaged. About one quarter of Meade's army (22,000 men) and one third of Lee's army (27,000) were engaged. Second day of battle. Plans and movement to battle.
Throughout the evening of July 1 and the morning of July 2, most of the remaining infantry of both armies arrived on the field, including the Union II, III, V, VI, and XII Corps. Two of Longstreet's units were still on the road: Major General George Pickett's division had begun the march from Chambersburg, while Brigadier General Evander M. Law's brigade had begun the march from Guilford. Both arrived late in the morning; Law completed his march in eleven hours. The Union line ran from Culp's Hill southeast of the town, northwest to Cemetery Hill just south of town, then south along Cemetery Ridge, terminating just north of Little Round Top. Most of the XII Corps was on Culp's Hill; the remnants of the I and XI Corps defended Cemetery Hill; the II Corps covered most of the northern half of Cemetery Ridge; and the III Corps was ordered to take up a position to its flank. The shape of the Union line is popularly described as a "fishhook" formation. The Confederate line paralleled the Union line to the west on Seminary Ridge, ran east through the town, then curved southeast to a point opposite Culp's Hill. Thus, the Union army had interior lines, while the Confederate line was far longer. Lee's battle plan for July 2 called for a general assault on Meade's positions. On the right, Longstreet's First Corps was to position itself to attack the Union left flank, facing northeast astride the Emmitsburg Road, and to roll up the Union line. The attack sequence was to begin with Major Generals John Bell Hood's and Lafayette McLaws's divisions, followed by Major General Richard H. Anderson's division of Hill's Third Corps. On the left, Lee instructed Ewell to position his Second Corps to attack Culp's Hill and Cemetery Hill when he heard the gunfire from Longstreet's assault, preventing Meade from shifting troops to bolster his left. Though it does not appear in either his or Lee's official report, Ewell claimed years later that Lee had changed the order for a simultaneous attack, calling instead for only a "diversion", to be turned into a full-scale attack if a favorable opportunity presented itself. Lee's plan, however, was based on faulty intelligence, exacerbated by Stuart's continued absence from the battlefield. Though Lee personally reconnoitered his left during the morning, he did not visit Longstreet's position on the Confederate right. Even so, Lee rejected suggestions that Longstreet move beyond Meade's left and attack the Union flank, capturing the supply trains and effectively blocking Meade's escape route. Lee did not issue orders for the attack until 11:00 a.m. About noon, General Anderson's advancing troops were discovered by General Sickles's outpost guard, and the Third Corps—upon which Longstreet's First Corps was to form—did not get into position until 1:00 p.m. Hood and McLaws, after their long march, were not yet in position and did not launch their attacks until just after 4:00 p.m. and 5:00 p.m., respectively. Attacks on the Union left flank. As Longstreet's left division, under Major General Lafayette McLaws, advanced, it unexpectedly found Major General Daniel Sickles's III Corps directly in its path. Sickles had been dissatisfied with the position assigned him on the southern end of Cemetery Ridge. Seeing ground better suited for artillery positions to the west—centered at the Sherfy farm's Peach Orchard—he violated orders and advanced his corps to the slightly higher ground along the Emmitsburg Road, moving away from Cemetery Ridge.
The new line ran from Devil's Den, northwest to the Peach Orchard, then northeast along the Emmitsburg Road to south of the Codori farm. This created an untenable salient at the Peach Orchard; Brigadier General Andrew A. Humphreys's division (in position along the Emmitsburg Road) and Major General David B. Birney's division (to the south) were subject to attacks from two sides and were spread out over a longer front than their small corps could defend effectively. The Confederate artillery was ordered to open fire at 3:00 p.m. After Sickles failed to attend a meeting of Meade's corps commanders at this time, Meade rode to Sickles's position and demanded an explanation of the situation. Knowing a Confederate attack was imminent and a retreat would be endangered, Meade refused Sickles's offer to withdraw. Meade was forced to send 20,000 reinforcements: the entire V Corps, Brigadier General John C. Caldwell's division of the II Corps, most of the XII Corps, and portions of the newly arrived VI Corps. Hood's division moved more to the east than intended, losing its alignment with the Emmitsburg Road and attacking Devil's Den and Little Round Top. McLaws, coming in on Hood's left, drove multiple attacks into the thinly stretched III Corps in the Wheatfield and overwhelmed them in Sherfy's Peach Orchard. McLaws's attack eventually reached Plum Run Valley (the "Valley of Death") before being beaten back by the Pennsylvania Reserves division of the V Corps, moving down from Little Round Top. The III Corps was virtually destroyed as a combat unit in this battle, and Sickles's leg was amputated after it was shattered by a cannonball. Caldwell's division was destroyed piecemeal in the Wheatfield. Anderson's division, coming from McLaws's left and starting forward around 6:00 p.m., reached the crest of Cemetery Ridge, but could not hold the position in the face of counterattacks from the II Corps, including an almost suicidal bayonet charge by the 1st Minnesota regiment against a Confederate brigade, ordered in desperation by Hancock to buy time for reinforcements to arrive. As fighting raged in the Wheatfield and Devil's Den, Colonel Strong Vincent of the V Corps had a precarious hold on Little Round Top, an important hill at the extreme left of the Union line. His brigade of four relatively small regiments was able to resist repeated assaults by Law's brigade of Hood's division. Meade's chief engineer, Brigadier General Gouverneur K. Warren, had realized the importance of this position and dispatched Vincent's brigade, an artillery battery, and the 140th New York to occupy Little Round Top mere minutes before Hood's troops arrived. The defense of Little Round Top, culminating in a bayonet charge by the 20th Maine ordered by Colonel Joshua L. Chamberlain and possibly led down the slope by Lieutenant Holman S. Melcher, was one of the most fabled episodes of the Civil War and propelled Chamberlain into prominence after the war. Attacks on the Union right flank. Ewell interpreted his orders as calling only for a cannonade. His 32 guns, along with A. P. Hill's 55 guns, engaged in a two-hour artillery barrage at extreme range that had little effect. Finally, about six o'clock, Ewell sent orders to each of his division commanders to attack the Union lines in his front. Major General Edward "Allegheny" Johnson's division had contemplated an assault on Culp's Hill, but it was still a mile away and had Rock Creek to cross. The few possible crossings would cause significant delays.
Because of this, only three of Johnson's four brigades moved to the attack. Most of the hill's defenders, the Union XII Corps, had been sent to the left to defend against Longstreet's attacks, leaving only a brigade of New Yorkers under Brigadier General George S. Greene behind strong, newly constructed defensive works. With reinforcements from the I and XI corps, Greene's men held off the Confederate attackers, though they gave up some of the earthworks on the lower part of Culp's Hill. Early was similarly unprepared when he ordered Harry T. Hays's and Isaac E. Avery's brigades to attack the Union XI Corps positions on East Cemetery Hill. Once started, the fighting was fierce: Colonel Andrew L. Harris of the Union 2nd Brigade, 1st Division, XI Corps came under a withering attack, losing half his men. Avery was wounded early on, but the Confederates reached the crest of the hill and entered the Union breastworks, capturing one or two batteries. Seeing that he was not supported on his right, Hays withdrew. His right was to have been supported by Robert E. Rodes's division, but Rodes—like Early and Johnson—had not been ordered up in preparation for the attack. He had twice as far to travel as Early; by the time he came in contact with the Union skirmish line, Early's troops had already begun to withdraw. Jeb Stuart and his three cavalry brigades arrived in Gettysburg around noon but had no role in the second day's battle. Brigadier General Wade Hampton's brigade fought a minor engagement with the Michigan cavalry of newly promoted 23-year-old Brigadier General George Armstrong Custer near Hunterstown, to the northeast of Gettysburg. Third day of battle. Lee's plan. Lee wished to renew the attack on Friday, July 3, using the same basic plan as the previous day: Longstreet would attack the Union left, while Ewell attacked Culp's Hill. However, before Longstreet was ready, Union XII Corps troops started a dawn artillery bombardment against the Confederates on Culp's Hill in an effort to regain a portion of their lost works. The Confederates attacked, and the second fight for Culp's Hill ended around 11 a.m. Historian Harry Pfanz judged that, after some seven hours of bitter combat, "the Union line was intact and held more strongly than before". Lee was forced to change his plans. Longstreet would command Pickett's Virginia division of his own First Corps, plus six brigades from Hill's corps, in an attack on the Union II Corps position at the right center of the Union line on Cemetery Ridge. Prior to the attack, all the artillery the Confederacy could bring to bear on the Union positions would bombard and weaken the enemy's line. In his memoir, Longstreet states that he told Lee there were not enough men for McLaws's and Hood's divisions, reinforced by Pickett's brigades, to assault the strong left center of the Union line. Longstreet thought the attack would be repulsed and that a counterattack would put Union forces between the Confederates and the Potomac River. Longstreet wrote that he said it would take a minimum of thirty thousand men to attack successfully, as well as close coordination with other Confederate forces. He noted that only about thirteen thousand men were left in the selected divisions after the first two days of fighting. They would have to walk a mile under heavy artillery and long-range musketry fire. Longstreet states that he further asked Lee about "the strength of the column. He [Lee] stated fifteen thousand.
Opinion was then expressed [by Longstreet] that the fifteen thousand men who could make successful assault over that field had never been arrayed for battle; but he was impatient of listening, and tired of talking, and nothing was left but to proceed." Largest artillery bombardment of the war. Around 1 p.m., from 150 to 170 Confederate guns began an artillery bombardment that was probably the largest of the war. To save valuable ammunition for the infantry attack that they knew would follow, the Army of the Potomac's artillery, under the command of Brigadier General Henry Jackson Hunt, at first did not return the enemy's fire. After waiting about 15 minutes, about 80 Union cannons opened fire. The Army of Northern Virginia was critically low on artillery ammunition, and the cannonade did not significantly affect the Union position. Pickett's Charge. Around 3 p.m., the cannon fire subsided, and between 10,500 and 12,500 Confederate soldiers stepped from the ridgeline and advanced the three-quarters of a mile (1,200 m) to Cemetery Ridge. Because of the role of Pickett's division in leading the assault, it has become known as Pickett's Charge. As the Confederates approached, there was fierce flanking artillery fire from Union positions on Cemetery Hill and the Little Round Top area, and musket and canister fire from Hancock's II Corps. In the Union center, the commander of artillery had held fire during the Confederate bombardment (to save it for the infantry assault, which Meade had correctly predicted the day before), leading Southern commanders to believe the Northern cannon batteries had been knocked out. However, they opened fire on the Confederate infantry during their approach with devastating results. Although the Union line wavered and broke temporarily at a jog called The Angle in a low stone fence, just north of a patch of vegetation called the Copse of Trees, reinforcements rushed into the breach, and the Confederate attack was repelled. The farthest advance, by Brigadier General Lewis A. Armistead's brigade of Pickett's division at the Angle, is referred to as the "high-water mark of the Confederacy". Union and Confederate soldiers locked in hand-to-hand combat, attacking with their rifles, bayonets, rocks and even their bare hands. Armistead ordered his Confederates to turn two captured cannons against Union troops, but discovered that there was no ammunition left, the last double canister shots having been used against the charging Confederates. Armistead was mortally wounded shortly afterward. Nearly half of the Confederate attackers did not return to their own lines. Pickett's division lost about two-thirds of its men, and all three of its brigadiers were killed or wounded. Cavalry battles. There were two significant cavalry engagements on July 3. The first was coordinated with Pickett's Charge, and the standoff may have prevented a disaster for the Union infantry. The site of this engagement is now known as East Cavalry Field. The second engagement was a loss for Union cavalry attacking Confederate infantry; it has been labeled a "fiasco" and featured faulty cavalry tactics. The site of this engagement is now known as South Cavalry Field. Northeast of Gettysburg. Stuart's cavalry division (three brigades), with the assistance of Jenkins's brigade, was sent to guard the Confederate left flank.
Stuart was also in position to exploit any success the Confederate infantry (Pickett's Charge) might achieve on Cemetery Hill by flanking the Union right and getting behind the Union infantry facing the Confederate attack. The cavalry fight took place northeast of Gettysburg at about 3:00 p.m.—around the end of the Confederate artillery barrage that preceded Pickett's Charge. Stuart's forces collided with Union cavalry: Brigadier General David McMurtrie Gregg's division and Custer's brigade from Brigadier General Judson Kilpatrick's division. The fight evolved into "a wild melee of swinging sabers and blazing pistols and carbines". One of Custer's regiments, the 5th Michigan Cavalry Regiment, was armed with Spencer repeating rifles, and at least two companies from an additional regiment were also armed with repeaters. The fight ended in a standoff, as neither side changed positions. However, Gregg and Custer prevented Stuart from gaining the rear of the Union infantry facing Pickett. Southwest of Gettysburg. After hearing news of the Union's success against Pickett's Charge, Kilpatrick launched a cavalry attack against the infantry positions of Longstreet's corps southwest of Big Round Top. The terrain was difficult for a mounted attack because it was rough, heavily wooded, and contained huge boulders—and Longstreet's men were entrenched with artillery support. Brigadier General Elon J. Farnsworth protested against the futility of such a move, but obeyed orders. Farnsworth was killed in the fourth of five unsuccessful attacks, and his brigade suffered significant losses. Although Kilpatrick was described by at least one Union leader as "brave, enterprising, and energetic", incidents such as Farnsworth's charge earned him the nickname "Kill Cavalry". Aftermath. Casualties. The two armies suffered between 46,000 and 51,000 casualties. Union casualties were 23,055 (3,155 killed, 14,531 wounded, 5,369 captured or missing), while Confederate casualties are more difficult to estimate. Many authors have referred to as many as 28,000 Confederate casualties, and Busey and Martin's more recent 2005 work, "Regimental Strengths and Losses at Gettysburg", documents 23,231 (4,708 killed, 12,693 wounded, 5,830 captured or missing). Nearly a third of Lee's general officers were killed, wounded, or captured. The casualties for both sides for the six-week campaign, according to Sears, were 57,225. In addition to being the deadliest battle of the war, Gettysburg also had the most generals killed in action. Several generals were also wounded. The Confederacy lost generals Paul Jones Semmes, William Barksdale, William Dorsey Pender, Richard Garnett, and Lewis Armistead, as well as J. Johnston Pettigrew during the retreat after the battle. Confederate generals who were wounded included Maj. Gen. John Bell Hood, who lost the use of his left arm, and Maj. Gen. Henry Heth, who received a shot to the head on the first day of battle (though incapacitated for the rest of the battle, he remarkably survived without long-term injuries, credited in part to a hat stuffed full of paper dispatches). Confederate generals James L. Kemper and Isaac R. Trimble were severely wounded during Pickett's Charge and captured during the Confederate retreat. Confederate Brig. Gen. James J. Archer, in command of a brigade that most likely was responsible for killing Reynolds, was taken prisoner shortly after Reynolds' death.
In the Confederate First Corps, eight of Longstreet's fourteen division and brigade commanders were killed or wounded, including Brig. Gen. George T. Anderson and Brig. Gen. Jerome B. Robertson, who were wounded. In Ewell's Second Corps, Brig. Gen. Isaac E. Avery was mortally wounded and Brig. Gen. John M. Jones was wounded. In Hill's Third Corps, in addition to Pender and Pettigrew being killed, Maj. Gen. Henry Heth and Col. Birkett D. Fry (later brigadier general), in temporary brigade command, were wounded, as were Brig. Gen. Alfred M. Scales and Col. William L. J. Lowrance, also in temporary brigade command. In the Confederate cavalry division, Brig. Gen. Wade Hampton and Brig. Gen. Albert G. Jenkins were wounded. Union generals killed were John Reynolds, Samuel K. Zook, and Stephen H. Weed, as well as Elon J. Farnsworth, whom Maj. Gen. Pleasonton had assigned as brigadier general on his own nomination (the promotion was confirmed only posthumously), and Strong Vincent, who after being mortally wounded was given a deathbed promotion to brigadier general. Additional senior officer casualties included the wounding of Union generals Dan Sickles (who lost a leg), Francis C. Barlow, Daniel Butterfield, and Winfield Scott Hancock. Five of the seven brigade commanders in Reynolds's First Corps were wounded. In addition to Hancock and Brig. Gen. John Gibbon being wounded in the Second Corps, three of its ten brigade commanders were killed and three were wounded. Busey and Martin provide detailed casualty figures by corps for the Union and Confederate forces during the three-day battle. Bruce Catton wrote, "The town of Gettysburg looked as if some universal moving day had been interrupted by catastrophe." But there was only one documented civilian death during the battle: Ginnie Wade (also widely known as Jennie), 20 years old, was hit by a stray bullet that passed through her kitchen in town while she was making bread. Another notable civilian casualty was John L. Burns, a 69-year-old veteran of the War of 1812 who walked to the front lines on the first day of battle and participated in heavy combat as a volunteer, receiving numerous wounds in the process. Though aged and injured, Burns survived the battle and lived until 1872. Nearly 8,000 soldiers had been killed outright; these bodies, lying in the hot summer sun, needed to be buried quickly. More than 3,000 horse carcasses were burned in a series of piles south of town; townsfolk became violently ill from the stench. Meanwhile, the town of Gettysburg, with its population of just 2,400, found itself tasked with taking care of 14,000 wounded Union troops and an additional 8,000 Confederate prisoners. The Confederates lost between 31 and 55 battle flags, while the Union may have lost slightly fewer than 40; the captured Confederate battle flags were sent to Washington. Between 3,000 and 5,000 horses were killed. Confederate retreat. On the morning of July 4, with Lee's army still present, Meade ordered his cavalry to get to the rear of Lee's army. In a heavy rain, the armies stared at one another across the bloody fields, on the same day that the Vicksburg garrison, far to the west, surrendered to Major General Ulysses S. Grant. Lee had reformed his lines into a defensive position on Seminary Ridge the night of July 3, evacuating the town of Gettysburg. The Confederates remained on the battlefield's west side, hoping that Meade would attack, but the cautious Union commander decided against the risk, a decision for which he would later be criticized.
Both armies began to collect their remaining wounded and bury some of the dead. A proposal by Lee for a prisoner exchange was rejected by Meade. Late in the rainy afternoon, Lee started moving the non-fighting portion of his army back to Virginia. Cavalry under Brigadier General John D. Imboden was entrusted to escort the seventeen-mile-long wagon train of supplies and wounded men, using a long route through Cashtown and Greencastle to Williamsport, Maryland. After sunset, the fighting portion of Lee's army began its retreat to Virginia using a more direct (but more mountainous) route that began on the road to Fairfield. Although Lee knew exactly what he needed to do, Meade's situation was different. Meade needed to remain at Gettysburg until he was certain Lee was gone; if Meade left first, he could leave an opening for Lee to move on Washington or Baltimore. In addition, the army that left the battlefield first was often considered the defeated army. Union cavalry had some minor successes pursuing Lee's army. The first major encounter took place in the mountains at Monterey Pass on July 4, where Kilpatrick's cavalry division captured 150 to 300 wagons and took 1,300 to 1,500 prisoners. Beginning July 6, additional cavalry fighting took place closer to the Potomac River in Maryland's Williamsport-Hagerstown area. Lee's army was trapped and delayed from crossing the Potomac River because rainy weather had caused the river to swell, and the pontoon bridge at Falling Waters had been destroyed. Meade's infantry did not fully pursue Lee until July 7, and despite repeated pleas from Lincoln and Halleck, was not aggressive enough to destroy Lee's army. A new pontoon bridge was constructed at Falling Waters, and lower water levels allowed the Confederates to begin crossing after dark on July 13. Although Meade's infantry had reached the area on July 12, it was his cavalry that attacked the Confederate rear guard on the morning of July 14. Union cavalry took 500 prisoners, and Confederate Brigadier General Pettigrew was mortally wounded, but Lee's army completed its crossing of the Potomac. The campaign continued south of the Potomac until the Battle of Manassas Gap on July 23, when Lee escaped and Meade abandoned the pursuit. Union reaction to the news of the victory. The news of the Union victory electrified the North. A headline in "The Philadelphia Inquirer" proclaimed, "Victory! Waterloo Eclipsed!" New York diarist George Templeton Strong celebrated the victory in his diary. Union enthusiasm soon dissipated, however, as the public realized that Lee's army had escaped destruction and the war would continue. Lincoln complained to Gideon Welles, his Secretary of the Navy, that "Our army held the war in the hollow of their hand and they would not close it!" Brigadier General Alexander S. Webb wrote to his father on July 17, stating that such Washington politicians as "Chase, Seward and others", disgusted with Meade, "write to me that Lee really won that Battle!" Effect on the Confederacy. In fact, the Confederates had lost both militarily and politically. During the final hours of the battle, Confederate Vice President Alexander Stephens was approaching the Union lines at Norfolk, Virginia, under a flag of truce. Although his formal instructions from Confederate President Jefferson Davis had limited his powers to negotiating on prisoner exchanges and other procedural matters, historian James M. McPherson speculates that he had informal goals of presenting peace overtures.
Davis had hoped that Stephens would reach Washington from the south while Lee's victorious army was marching toward it from the north. President Lincoln, upon hearing of the Gettysburg results, refused Stephens's request to pass through the lines. Furthermore, when the news reached London, any lingering hopes of European recognition of the Confederacy were finally abandoned. Henry Adams, whose father was serving as the U.S. ambassador to the United Kingdom at the time, wrote, "The disasters of the rebels are unredeemed by even any hope of success. It is now conceded that all idea of intervention is at an end." Compounding the effects of the defeat was the surrender of Vicksburg to Grant's Federal armies in the West on July 4, the day after the Gettysburg battle, which cost the Confederacy an additional 30,000 men, along with all their arms and stores. The immediate reaction of the Southern military and public sectors was that Gettysburg was a setback, not a disaster. The sentiment was that Lee had been successful on July 1 and had fought a valiant battle on July 2–3, but could not dislodge the Union Army from the strong defensive position to which it fled. The Confederates successfully stood their ground on July 4 and withdrew only after they realized Meade would not attack them. The withdrawal to the Potomac, which could have been a disaster, was handled masterfully. Furthermore, the Army of the Potomac had been kept away from Virginia farmlands for the summer, and all predicted that Meade would be too timid to threaten them for the rest of the year. Lee himself had a positive view of the campaign, writing to his wife that the army had returned "rather sooner than I had originally contemplated, but having accomplished what I proposed on leaving the Rappahannock, viz., relieving the Valley of the presence of the enemy and drawing his Army north of the Potomac". He was quoted as saying to Maj. John Seddon, brother of the Confederate secretary of war, "Sir, we did whip them at Gettysburg, and it will be seen for the next six months that "that army" will be as quiet as a sucking dove." Some Southern publications, such as the "Charleston Mercury", were critical of Lee's actions. On August 8, Lee offered his resignation to President Davis, who quickly rejected it. Gettysburg Address. The ravages of war were still evident in Gettysburg more than four months later when, on November 19, the Soldiers' National Cemetery, later renamed Gettysburg National Cemetery, was dedicated. During this ceremony, Lincoln honored the fallen and redefined the purpose of the war in his historic Gettysburg Address, a 271-word speech that is widely considered one of the most famous and significant in American history. Medal of Honor. There were 72 Medals of Honor awarded for the Gettysburg Campaign, 64 of them for actions taken during the battle itself. The first medal was awarded in December 1864, while the most recent was awarded posthumously to Lieutenant Alonzo Cushing in 2014. Historical assessment. Decisive victory controversies. The nature of the result of the Battle of Gettysburg has been the subject of controversy. Although not seen as overwhelmingly significant at the time, particularly since the war continued for almost two years, in retrospect it has often been cited as the "turning point", usually in combination with the fall of Vicksburg the following day.
This is based on the observation that, after Gettysburg, Lee's army conducted no more strategic offensives—it merely reacted to the initiative of Ulysses S. Grant in 1864 and 1865—and on the speculative viewpoint of the Lost Cause writers that a Confederate victory at Gettysburg might have resulted in the end of the war. It is currently a widely held view that Gettysburg was a decisive victory for the Union, but the term is considered imprecise. It is inarguable that Lee's offensive on July 3 was turned back decisively and his campaign in Pennsylvania was terminated prematurely (although the Confederates at the time argued that this was a temporary setback and that the goals of the campaign were largely met). However, when the more common definition of "decisive victory" is intended—an indisputable military victory of a battle that determines or significantly influences the ultimate result of a conflict—historians are divided. For example, David J. Eicher called Gettysburg a "strategic loss for the Confederacy" and James M. McPherson wrote that "Lee and his men would go on to earn further laurels. But they never again possessed the power and reputation they carried into Pennsylvania those palmy summer days of 1863." However, Herman Hattaway and Archer Jones wrote that the "strategic impact of the Battle of Gettysburg was ... fairly limited." Steven E. Woodworth wrote that "Gettysburg proved only the near impossibility of decisive action in the Eastern theater." Edwin Coddington pointed out the heavy toll on the Army of the Potomac and that "after the battle Meade no longer possessed a truly effective instrument for the accomplishments of his task. The army needed a thorough reorganization with new commanders and fresh troops, but these changes were not made until Grant appeared on the scene in March 1864." Joseph T. Glatthaar wrote that "Lost opportunities and near successes plagued the Army of Northern Virginia during its Northern invasion," yet after Gettysburg, "without the distractions of duty as an invading force, without the breakdown of discipline, the Army of Northern Virginia [remained] an extremely formidable force." Ed Bearss wrote, "Lee's invasion of the North had been a costly failure. Nevertheless, at best the Army of the Potomac had simply preserved the strategic stalemate in the Eastern Theater ..." Historian Allen Guelzo notes that Gettysburg and Vicksburg did not end the war, which went on for two more years, and that a little more than a year later Federal armies appeared hopelessly mired in sieges at Petersburg and Atlanta. Peter Carmichael refers to the military context for the armies, the "horrendous losses at Chancellorsville and Gettysburg, which effectively destroyed Lee's offensive capacity," implying that these cumulative losses were not the result of a single battle. Thomas Goss, writing in the U.S. Army's "Military Review" journal on the definition of "decisive" and the application of that description to Gettysburg, concludes: "For all that was decided and accomplished, the Battle of Gettysburg fails to earn the label 'decisive battle'." The military historian John Keegan agrees. Gettysburg was nonetheless a landmark battle, the largest of the war, and one that would not be surpassed. It restored the Union's belief in certain victory, and the loss dispirited the Confederacy. If "not exactly a decisive battle", Gettysburg marked the end of Confederate use of northern Virginia as a military buffer zone and set the stage for Grant's Overland Campaign.
Lee vs. Meade. Prior to Gettysburg, Robert E. Lee had established a reputation as an almost invincible general, achieving stunning victories against superior numbers—although usually at the cost of high casualties to his army—during the Seven Days, the Northern Virginia Campaign (including the Second Battle of Bull Run), Fredericksburg, and Chancellorsville. Only the Maryland Campaign, with its tactically inconclusive Battle of Antietam, had been less than successful. Therefore, historians such as Fuller, Glatthaar, and Sears have attempted to explain how Lee's winning streak was interrupted so dramatically at Gettysburg. Although the issue is tainted by attempts to portray history and Lee's reputation in a manner supporting different partisan goals, the major factors in Lee's loss can arguably be attributed to his overconfidence in the invincibility of his men; the performance of his subordinates, and his management thereof; his failing health; and the performance of his opponent, George G. Meade, and the Army of the Potomac. Throughout the campaign, Lee was influenced by the belief that his men were invincible; most of Lee's experiences with the Army of Northern Virginia had convinced him of this, including the great victory at Chancellorsville in early May and the rout of the Union troops at Gettysburg on July 1. Since morale plays an important role in military victory when other factors are equal, Lee did not want to dampen his army's desire to fight and resisted suggestions, principally by Longstreet, to withdraw from the recently captured Gettysburg to select ground more favorable to his army. War correspondent Peter W. Alexander wrote that Lee "acted, probably, under the impression that his troops were able to carry any position however formidable. If such was the case, he committed an error, such however as the ablest commanders will sometimes fall into." Lee himself concurred with this judgment, writing to President Davis, "No blame can be attached to the army for its failure to accomplish what was projected by me, nor should it be censured for the unreasonable expectations of the public—I am alone to blame, in perhaps expecting too much of its prowess and valor." The most controversial assessments of the battle involve the performance of Lee's subordinates. The dominant theme of the Lost Cause writers and many other historians is that Lee's senior generals failed him in crucial ways, directly causing the loss of the battle; the alternative viewpoint is that Lee did not manage his subordinates adequately and did not compensate for their shortcomings. Two of his corps commanders—Richard S. Ewell and A.P. Hill—had only recently been promoted and were not fully accustomed to Lee's style of command, in which he provided only general objectives and guidance; their former commander, Stonewall Jackson, had translated such guidance into detailed, specific orders to his division commanders. All four of Lee's principal commanders received criticism during the campaign and battle. In addition to the illness of Hill, who was unwell during the battle, Lee's own performance was affected by heart troubles, which would eventually lead to his death in 1870; he had been diagnosed with pericarditis by his staff physicians in March 1863, though modern doctors believe he had in fact suffered a heart attack. As a final factor, Lee faced a new and formidable opponent in George G. Meade, and the Army of the Potomac fought well on its home territory.
Although new to army command, Meade deployed his forces relatively effectively; relied on strong subordinates such as Winfield S. Hancock to make decisions where and when they were needed; took great advantage of defensive positions; nimbly shifted defensive resources on interior lines to parry strong threats; and, unlike some of his predecessors, stood his ground throughout the battle in the face of fierce Confederate attacks. Lee was quoted before the battle as saying Meade "would commit no blunders on my front and if I make one ... will make haste to take advantage of it." That prediction proved correct at Gettysburg. Stephen Sears wrote, "The fact of the matter is that George G. Meade, unexpectedly and against all odds, thoroughly outgeneraled Robert E. Lee at Gettysburg." Edwin B. Coddington wrote that the soldiers of the Army of the Potomac received a "sense of triumph which grew into an imperishable faith in [themselves]. The men knew what they could do under an extremely competent general; one of lesser ability and courage could well have lost the battle." Meade had his own detractors as well. As with Lee, Meade suffered partisan attacks over his performance at Gettysburg, but he had the misfortune of experiencing them in person. Supporters of his predecessor, Hooker, lambasted Meade before the U.S. Congress's Joint Committee on the Conduct of the War, where Radical Republicans suspected that Meade was a Copperhead and tried in vain to relieve him from command. Daniel E. Sickles and Daniel Butterfield accused Meade of planning to retreat from Gettysburg during the battle. Most politicians, including Lincoln, criticized Meade for what they considered his half-hearted pursuit of Lee after the battle. A number of Meade's most competent subordinates—Winfield S. Hancock, John Gibbon, Gouverneur K. Warren, and Henry J. Hunt, all heroes of the battle—defended Meade in print, but Meade was embittered by the overall experience. Battlefield preservation. Gettysburg National Cemetery and Gettysburg National Military Park are maintained by the U.S. National Park Service as two of the nation's most revered historical landmarks. Although Gettysburg is one of the best known of all Civil War battlefields, it too faces threats to its preservation and interpretation. Many historically significant locations on the battlefield lie outside the boundaries of Gettysburg National Military Park and are vulnerable to residential or commercial development. Some preservation successes have emerged in recent years. Two proposals to open a casino at Gettysburg were defeated, in 2006 and most recently in 2011, when public pressure forced the Pennsylvania Gaming Control Board to reject the proposed gambling hub at the intersection of Routes 15 and 30, near East Cavalry Field. The American Battlefield Trust, formerly the Civil War Trust, also successfully purchased land at the former site of the Gettysburg Country Club and transferred it to the control of the U.S. Department of the Interior in 2011. Less than half of the more than 11,500 acres of the old Gettysburg Battlefield have been preserved for posterity thus far. The American Battlefield Trust and its partners have acquired and preserved portions of the battlefield in more than 40 separate transactions from 1997 to mid-2023, and some of that acreage is now part of the Gettysburg National Military Park.
In 2015, the Trust made one of its most important and expensive acquisitions, paying $6 million for a parcel that included the stone house Confederate General Robert E. Lee used as his headquarters during the battle. The Trust razed a motel, restaurant and other buildings within the parcel to restore Lee's headquarters and the site to their wartime appearance, adding interpretive signs. It opened the site to the public in October 2016. In popular culture. According to a 1938 Army Medical report, 50,000 veterans attended the 50th anniversary Gettysburg reunion in 1913. Historian Carol Reardon writes that attendance included at least 35,000 Union veterans but only a little more than 7,000 Confederate veterans, most from Virginia and North Carolina, though estimates of attendees ran as high as 56,000. Some veterans re-enacted Pickett's Charge in a spirit of reconciliation, a meeting that carried great emotional force for both sides, and there was a ceremonial mass handshake across a stone wall on Cemetery Ridge. At the 75th anniversary Gettysburg reunion in 1938, 1,333 Union veterans and 479 Confederate veterans attended. Film records survive of both reunions, held on the battlefield in 1913 and 1938. During the American Civil War Centennial, the official United States commemoration of the war, the U.S. Post Office issued a series of noncontroversial commemorative stamps marking the 100th anniversaries of famous battles. Commemoration activities began in 1957, four years before the 100th anniversary of the war's first battle, and ended in 1965 with the 100th anniversary of the surrender at Appomattox. The United States Civil War Centennial Commission and state commissions were established to organize the commemorations. The National Park Service and other federal agencies that controlled key Civil War battlefields used the Centennial to successfully lobby Congress for increased funding to re-landscape and interpret these battlefields for the general public. The children's novel "Window of Time" (1991), by Karen Weinberg, tells the story of a boy transported by time travel from the 1980s to the Battle of Gettysburg. The Battle of Gettysburg was depicted in the 1993 film "Gettysburg", based on Michael Shaara's 1974 novel "The Killer Angels". The film and novel focused primarily on the actions of Joshua Lawrence Chamberlain, John Buford, Robert E. Lee, and James Longstreet during the battle. The first day focused on Buford's cavalry defense, the second day on Chamberlain's defense at Little Round Top, and the third day on Pickett's Charge.
Budweiser
Budweiser is an American-style pale lager, a brand of the Belgian company AB InBev. Introduced in 1876 by Carl Conrad & Co. of St. Louis, Missouri, Budweiser has become one of the largest-selling beers in the United States. Budweiser is a filtered beer, available on draft and in bottles and cans, made with up to 30% rice in addition to hops and barley malt. There is an ongoing series of trademark disputes between Anheuser-Busch and the Czech company Budweiser Budvar Brewery over the use of the name. Usually, either Anheuser-Busch or Budweiser Budvar is granted the exclusive use of the "Budweiser" name in a given market. The Anheuser-Busch lager is available in over 80 countries, but is marketed as "Bud" in areas where Budvar has use of the Budweiser name. Name origin and dispute. The name "Budweiser" is a German derivative adjective meaning "of Budweis". Beer has been brewed in Budweis, Bohemia (now České Budějovice, Czech Republic) since the town was founded in 1265. In 1876, Adolphus Busch and his friend Carl Conrad developed a "Bohemian-style" lager in the United States, inspired by a trip to Bohemia, and produced it in their brewery in St. Louis, Missouri. Anheuser-Busch has been involved in multiple trademark disputes with the Budweiser Budvar Brewery of České Budějovice over the trademark rights to the name "Budweiser". In the European Union, except Ireland, Sweden, Finland and Spain, the American beer may only be marketed as "Bud", as the Budweiser trademark name is owned solely by the Czech beer maker Budweiser Budvar. In some countries, such as the United Kingdom, both the Budvar and Anheuser-Busch lagers are available under the Budweiser name, though their logos differ. Marketing. The Budweiser from Budějovice has been called "The Beer of Kings" since the 16th century. Adolphus Busch adapted this slogan to "The King of Beers." This history notwithstanding, Anheuser-Busch owns the trademark to these slogans in the United States. In 1969, Anheuser-Busch introduced the Superman-esque advertising character Bud Man, who served as one of the inspirations for several later characters, including Duffman on "The Simpsons". From 1987 to 1989, Bud Light ran an advertising campaign centered on the canine mascot Spuds MacKenzie. In 2010, the Bud Light brand paid $1 billion for a six-year licensing agreement with the NFL. Budweiser pays $20 million annually for MLB licensing rights. Budweiser has produced a number of TV advertisements, such as the Budweiser Frogs, lizards impersonating the Budweiser frogs, a campaign built around the phrase "Whassup?", and a team of Clydesdale horses commonly known as the Budweiser Clydesdales. Budweiser also advertises in motorsports, from Bernie Little's Miss Budweiser hydroplane boat to sponsorship of the Budweiser King Top Fuel dragster driven by Brandon Bernstein. Anheuser-Busch has sponsored the CART championship. It is the "Official Beer of NHRA", and it was the "Official Beer of NASCAR" from 1998 to 2007. It has sponsored motorsport events such as the Daytona Speedweeks, Budweiser Shootout, Budweiser Duel, Budweiser Pole Award, Budweiser 500, Budweiser 400, Budweiser 300, Budweiser 250, Budweiser 200, and Carolina Pride / Budweiser 200. Starting in 2016, however, the focus of Anheuser-Busch's NASCAR sponsorship became its Busch brand. Budweiser has sponsored NASCAR teams such as Junior Johnson, Hendrick Motorsports, DEI, and Stewart-Haas Racing. Sponsored drivers include Dale Earnhardt Jr. (1999–2007), Kasey Kahne (2008–2010), and Kevin Harvick (2011–2015).
In IndyCar, Budweiser sponsored Mario Andretti (1983–1984), Bobby Rahal (1985–1988), Scott Pruett (1989–1992), Roberto Guerrero (1993), Scott Goodyear (1994), Paul Tracy (1995), Christian Fittipaldi (1996–1997), and Richie Hearn (1998–1999). Between 2003 and 2006, Budweiser was a sponsor of the BMW Williams Formula One team. Anheuser-Busch has placed Budweiser as an official partner and sponsor of Major League Soccer and the Los Angeles Galaxy, and it was the headline sponsor of the British Basketball League in the 1990s. Anheuser-Busch has also placed Budweiser as an official sponsor of the Premier League and the presenting sponsor of the FA Cup. In the early 20th century, the company commissioned a play-on-words song called "Under the Anheuser Bush," which was recorded by several early phonograph companies. In 2009, Anheuser-Busch partnered with the popular Chinese video-sharing site Tudou.com for a user-generated online video contest. The contest encouraged users to submit ideas that included ants for a Bud TV spot set to run in February 2010 during Chinese New Year. In 2010, Budweiser produced an online reality TV series centered on the 2010 FIFA World Cup in South Africa called "Bud House", following the lives of 32 international soccer fans (one representing each nation in the World Cup) living together in a house in South Africa. Anheuser-Busch advertises the Budweiser brand heavily, spending $449 million in the United States alone in 2012, which made Budweiser the most advertised drink brand in America and accounted for a third of the company's US marketing budget. On November 5, 2012, Anheuser-Busch asked Paramount Pictures to obscure or remove the Budweiser logo from the film "Flight" (2012), directed by Robert Zemeckis and starring Denzel Washington. In an advertisement titled "Brewed the Hard Way", which aired during Super Bowl XLIX, Budweiser touted itself as "Proudly A Macro Beer", distinguishing it from smaller-production craft beers. In 2016, Beer Park by Budweiser opened on the Las Vegas Strip. On October 7, 2016, the Budweiser Clydesdales made a special appearance on the Danforth Campus at Washington University in St. Louis ahead of the presidential debate, and a special batch beer named Lilly's Lager was brewed exclusively for the occasion. In December 2020, Budweiser sent personalized bottles of beer to every goalkeeper Lionel Messi had scored against. Containers and packaging. Containers. Budweiser has been distributed in containers of many sizes and varieties. Until the early 1950s, Budweiser was primarily distributed in three packages: kegs and bottles of two sizes. Cans were first introduced in 1936. In 1955, August Busch Jr. made a strategic move to expand Budweiser's brand and distributor presence nationally. Along with this expansion came advances in bottling automation, bottling materials and distribution methods, which brought new containers and package designs. Budweiser is distributed in four large container volumes: half-barrel kegs, quarter-barrel kegs, sixth-barrel kegs and "beer balls". Budweiser also produces cans and bottles in a variety of sizes. On August 3, 2011, Anheuser-Busch announced its twelfth can design since 1936, one which emphasizes the bowtie. Packages are sometimes tailored to local customs and traditions; in St. Mary's County, Maryland, for example, a particular can size is the preferred package. Cans. In an attempt to re-stimulate interest in its beer after the repeal of Prohibition, Budweiser began canning its beer in 1936.
This new packaging led to an increase in sales which lasted until the start of World War II in 1939. Over the years, Budweiser cans have undergone various design changes in response to market conditions and consumer tastes. Since 1936, 12 major can design changes have occurred, not including temporary special edition designs. Budweiser cans have traditionally displayed patriotic American symbols, such as eagles and the colors red, white, and blue. In 2011, a branding redesign eliminated some of the traditional imagery. The new design was largely a response to a large decline in sales that threatened Budweiser's status as America's best-selling beer. In order to regain the domestic market share that Budweiser had lost, the company tried to update its appearance by giving the can a more contemporary look. The company hoped that the new design would offset the effects that unemployment had had on its sales. Although the more modern design was aimed at young male Americans, it was also part of an attempt to focus on the international market. Budweiser began selling its beer in Russia in 2010, and is currently expanding its operations in China. The beer. Budweiser is produced using malted barley, rice, water, hops and yeast. The brewing happens in seven steps: milling, mashing, straining, brew kettle, primary fermentation, beechwood lagering and finishing. It is lagered with beechwood chips in the aging vessel. Because the beechwood chips are boiled in sodium bicarbonate (baking soda) for seven hours beforehand, there is little to no flavor contribution from the wood. The maturation tanks that Anheuser-Busch uses are horizontal, causing flocculation of yeast to occur much more quickly. Anheuser-Busch refers to this process as a secondary fermentation, the idea being that the chips give the yeast more surface area to rest on. This is combined with a krausening procedure that re-introduces wort into the chip tank, reactivating the fermentation process. Placing beechwood chips at the bottom of the tank keeps the yeast in suspension longer, giving it more time to reabsorb and process green beer flavors such as acetaldehyde and diacetyl, which Anheuser-Busch believes are off-flavors that detract from overall drinkability. Budweiser and Bud Light are sometimes advertised as vegan beers, in that their ingredients and conditioning do not use animal by-products. Some people, however, object to the genetically engineered rice and the animal products used in the brewing process. In July 2006, Anheuser-Busch brewed a version of Budweiser with organic rice for sale in Mexico. It has yet to extend this practice to any other country. Budweiser brands. In addition to the regular Budweiser, Anheuser-Busch brews several different beers under the Budweiser brand, including Bud Light, Bud Ice, and Bud Light Lime. In July 2010, Anheuser-Busch launched Budweiser 66 in the United Kingdom. Budweiser Brew No. 66 has 4% alcohol by volume, and is brewed and distributed in the UK by Inbev UK Limited. In 2020, Budweiser introduced Bud Light Seltzer. In August 2020, Bud Light Seltzer added grapefruit, cranberry and pineapple flavors to its original offerings of black cherry, mango, lemon lime and strawberry. In October 2020, Bud Light Seltzer added Apple Crisp, Peppermint Pattie, and Gingersnap flavors, with the cans sporting "ugly sweater" designs. In July 2020, Budweiser introduced Bud Zero, its first alcohol-free low-calorie beer. It has zero sugar, zero alcohol, and 50 calories.
Temporary "America" labeling. On May 10, 2016, "Advertising Age" reported that the Alcohol and Tobacco Tax and Trade Bureau had approved new Budweiser labels to be used on 12-ounce cans and bottles from May 23 until the November elections. The name "Budweiser" was changed to "America". Much of the text on the packaging was replaced with patriotic American slogans, such as E pluribus unum and "Liberty & Justice For All". International production. Budweiser is licensed, produced and distributed in Canada by Labatt Brewing Company (also owned by AB InBev). Of the 15 Anheuser-Busch breweries outside of the United States, 14 of them are positioned in China. Budweiser is the fourth leading brand in the Chinese beer market.
4854
7903804
https://en.wikipedia.org/wiki?curid=4854
Bermuda Triangle
The Bermuda Triangle, also known as the Devil's Triangle, is a loosely defined region in the North Atlantic Ocean, roughly bounded by Florida, Bermuda, and Puerto Rico. Since the mid-20th century, it has been the focus of an urban legend suggesting that many aircraft, ships, and people have disappeared there under mysterious circumstances. However, extensive investigations by reputable sources, including the U.S. government and scientific organizations, have found no evidence of unusual activity, attributing reported incidents to natural phenomena, human error, and misinterpretation. Origins. The earliest suggestion of unusual disappearances in the Bermuda area appeared in an article written by Edward Van Winkle Jones of the "Miami Herald" that was distributed by the Associated Press and appeared in various American newspapers on 17 September 1950. Two years later, "Fate" magazine published "Sea Mystery at Our Back Door": a short article, by George X. Sand, that was the first to lay out the now-familiar triangular area where the losses took place. Sand recounted the loss of several planes and ships since World War II: the disappearance of "Sandra", a tramp steamer; the December 1945 loss of Flight 19, a group of five US Navy torpedo bombers on a training mission; the January 1948 disappearance of "Star Tiger", a British South American Airways (BSAA) passenger airplane; the March 1948 disappearance of a fishing skiff with three men, including jockey Albert Snider; the December 1948 disappearance of an Airborne Transport DC-3 charter flight en route from Puerto Rico to Miami; and the January 1949 disappearance of "Star Ariel", another BSAA passenger airplane. Flight 19 was covered again in the April 1962 issue of "The American Legion Magazine". In it, author Allan W. Eckert wrote that the flight leader had been heard saying, "We cannot be sure of any direction ... everything is wrong ... strange ... the ocean doesn't look as it should." In February 1964, Vincent Gaddis wrote an article called "The Deadly Bermuda Triangle" in "Argosy" saying Flight 19 and other disappearances were part of a pattern of strange events in the region, dating back to at least 1840. The next year, Gaddis expanded this article into a book, "Invisible Horizons". Other writers elaborated on Gaddis' ideas, including John Wallace Spencer ("Limbo of the Lost", 1969, repr. 1973); Charles Berlitz ("The Bermuda Triangle", 1974); and Richard Winer ("The Devil's Triangle", 1974). Several of these authors incorporated supernatural elements. Triangle area. Sand's article in "Fate" described the area as "a watery triangle bounded roughly by Florida, Bermuda and Puerto Rico". The "Argosy" article by Gaddis further delineated the boundaries, giving its vertices as Miami, San Juan, and Bermuda. Subsequent writers did not necessarily follow this definition. Some writers gave different boundaries and vertices to the triangle, with estimates of the total area varying widely. "Indeed, some writers even stretch it as far as the Irish coast," according to a 1977 BBC program. Consequently, the determination of which accidents occurred inside the triangle depends on which writer reported them. Criticism of the concept. Larry Kusche. Larry Kusche, author of "The Bermuda Triangle Mystery: Solved" (1975), argued that many claims of Gaddis and subsequent writers were exaggerated, dubious or unverifiable.
Kusche's research revealed a number of inaccuracies and inconsistencies between Berlitz's accounts and statements from eyewitnesses, participants, and others involved in the initial incidents. Kusche noted cases where pertinent information went unreported, such as the disappearance of round-the-world yachtsman Donald Crowhurst, which Berlitz had presented as a mystery despite clear evidence to the contrary. Another example was an ore-carrier that Berlitz recounted as lost without trace three days out of an Atlantic port, when in fact it had been lost three days out of a port with the same name on the "Pacific" Ocean. Kusche also argued that a large percentage of the incidents that sparked allegations of the Triangle's mysterious influence actually occurred well outside it. Often his research was simple: he would review period newspapers from the dates of reported incidents and find reports on possibly relevant events, like unusual weather, that were never mentioned in the disappearance stories. Kusche concluded that the legend of the Bermuda Triangle was a manufactured mystery, perpetuated by writers who either purposely or unknowingly made use of misconceptions, faulty reasoning, and sensationalism. Further responses. When the British Channel 4 television program "The Bermuda Triangle" (1992) was being produced by John Simmons of Geofilms for the "Equinox" series, the marine insurance market Lloyd's of London was asked if an unusually large number of ships had sunk in the Bermuda Triangle area. Lloyd's determined that large numbers of ships had not sunk there, and it does not charge higher rates for passage through the area. United States Coast Guard records confirm this conclusion. In fact, the number of supposed disappearances is relatively insignificant considering the number of ships and aircraft that pass through on a regular basis. The Coast Guard is also officially skeptical of the Triangle, noting that it collects and publishes, through its inquiries, much documentation contradicting many of the incidents written about by the Triangle authors. In one such incident involving the 1972 explosion and sinking of the tanker "V. A. Fogg", the Coast Guard photographed the wreck and recovered several bodies, in contrast with one Triangle author's claim that all the bodies had vanished, with the exception of the captain, who was found sitting in his cabin at his desk, clutching a coffee cup. In addition, "V. A. Fogg" sank off the coast of Texas, nowhere near the commonly accepted boundaries of the Triangle. "Nova"/"Horizon" aired the episode "The Case of the Bermuda Triangle" on 27 June 1976. The episode was highly critical, stating that "When we've gone back to the original sources or the people involved, the mystery evaporates. Science does not have to answer questions about the Triangle because those questions are not valid in the first place ... Ships and planes behave in the Triangle the same way they behave everywhere else in the world." Skeptical researchers, such as Ernest Taves and Barry Singer, have noted how mysteries and the paranormal are very popular and profitable, which has led to the production of vast amounts of material on topics such as the Bermuda Triangle. They were able to show that much of the pro-paranormal material is misleading or inaccurate, but its producers continue to market it. Accordingly, they have claimed that the market is biased in favor of books, TV specials, and other media that support the Triangle mystery, and against well-researched material if it espouses a skeptical viewpoint. In a 2013 study, the World Wide Fund for Nature identified the world's 10 most dangerous waters for shipping, but the Bermuda Triangle was not among them.
Benjamin Radford, an author and scientific paranormal investigator, noted in an interview on the Bermuda Triangle that it could be very difficult to locate an aircraft lost at sea due to the vast search area, and although the disappearance might be mysterious, that did not make it paranormal or unexplainable. Radford further noted the importance of double-checking information, as the mystery surrounding the Bermuda Triangle had been created by people who had neglected to do so. NOAA attributes most Bermuda Triangle disappearances to environmental factors such as hurricanes, sudden weather shifts from the Gulf Stream, and hazardous shallow waters. The U.S. Navy dismisses supernatural claims, emphasizing natural causes and human error. Additionally, the U.S. Board on Geographic Names does not list the Bermuda Triangle as an official location, given the lack of evidence distinguishing it from other ocean regions. Hypothetical explanation attempts. People who accept the Bermuda Triangle as a real phenomenon have offered a number of explanatory approaches. Paranormal explanations. Triangle writers have used a number of supernatural concepts to explain the events. One explanation pins the blame on leftover technology from the mythical lost city or state of Atlantis. Sometimes connected to the Atlantis story is the submerged rock formation known as the Bimini Road off the island of Bimini in the Bahamas, which is in the Triangle by some definitions. Followers of the purported psychic Edgar Cayce take his prediction that evidence of Atlantis would be found in 1968 as referring to the discovery of the Bimini Road. Believers describe the formation as a road, wall, or other structure, but the Bimini Road is of natural origin. Some hypothesize that a parallel universe exists in the Bermuda Triangle region, causing a time/space warp that sucks objects into it. Others attribute the events to UFOs. Charles Berlitz, author of various books on anomalous phenomena, lists several theories attributing the losses in the Triangle to anomalous or unexplained forces. Natural explanations. Compass variations. Compass issues are frequently cited in accounts of Triangle incidents. While some have theorized that unusual local magnetic anomalies may exist in the area, such anomalies have not been found. Compasses have natural magnetic variations in relation to the magnetic poles, a fact that navigators have known for centuries. Magnetic (compass) north and geographic (true) north coincide exactly only in a small number of places – in the United States, for example, only at places on a line running from Wisconsin to the Gulf of Mexico. But the public may be less well informed and may think there is something mysterious about a compass "changing" across an area as large as the Triangle, which it naturally will. Gulf Stream. The Gulf Stream (Florida Current) is a major surface current, primarily driven by thermohaline circulation, that originates in the Gulf of Mexico and flows through the Straits of Florida into the North Atlantic. In essence, it is a river within an ocean, and, like a river, it can and does carry floating objects; its surface velocity is high. A small plane making a water landing or a boat having engine trouble can be carried away from its reported position by the current. Human error. One of the most cited explanations in official inquiries as to the loss of any aircraft or vessel is human error.
Human stubbornness may have caused businessman Harvey Conover to lose his sailing yacht, "Revonoc", as he sailed into the teeth of a storm south of Florida on 1 January 1958. Violent weather. Hurricanes are powerful storms that form in tropical waters and have historically cost thousands of lives and caused billions of dollars in damage. The sinking of Francisco de Bobadilla's Spanish fleet in 1502 was the first recorded instance of a destructive hurricane. These storms have in the past caused a number of incidents related to the Triangle. Many Atlantic hurricanes pass through the Triangle as they recurve off the Eastern Seaboard, and, before the advent of weather satellites, ships often had little to no warning of a hurricane's approach. A powerful downdraft of cold air was suspected to be a cause in the sinking of "Pride of Baltimore" on 14 May 1986. The crew of the sunken vessel noted that the wind suddenly shifted and increased dramatically in velocity. A National Hurricane Center satellite specialist, James Lushine, stated "during very unstable weather conditions the downburst of cold air from aloft can hit the surface like a bomb, exploding outward like a giant squall line of wind and water." Methane hydrates. An explanation for some of the disappearances has focused on the presence of large fields of methane hydrates (a form of natural gas) on the continental shelves. Laboratory experiments carried out in Australia have proven that bubbles can, indeed, sink a scale model ship by decreasing the density of the water, and any wreckage would be deposited on the ocean floor or rapidly dispersed by the Gulf Stream. It has been hypothesized that periodic methane eruptions (sometimes called "mud volcanoes") may produce regions of frothy water that are no longer capable of providing adequate buoyancy for ships. If this were the case, such an area forming around a ship could cause it to sink very rapidly and without warning. Publications by the USGS describe large stores of undersea hydrates worldwide, including the Blake Ridge area off the coast of the southeastern United States. However, according to the USGS, no large releases of gas hydrates are believed to have occurred in the Bermuda Triangle for the past 15,000 years. Notable incidents. HMS "Atalanta". The sail training ship HMS "Atalanta" (originally named HMS "Juno") disappeared with her entire crew after setting sail from the Royal Naval Dockyard, Bermuda, for Falmouth, England, on 31 January 1880. It was presumed that she sank in a powerful storm which crossed her route a couple of weeks after she sailed, and that the inexperience of her crew, composed primarily of trainees, may have been a contributing factor. The search for evidence of her fate attracted worldwide attention at the time (connection is also often made to the 1878 loss of the training ship HMS "Eurydice", which foundered after departing the Royal Naval Dockyard in Bermuda for Portsmouth on 6 March), and she was alleged decades later to have been a victim of the mysterious triangle, an allegation resoundingly refuted by the research of author David Francis Raine in 1997. USS "Cyclops". The incident resulting in the single largest loss of life in the history of the US Navy not related to combat occurred when the collier "Cyclops", carrying a full load of manganese ore and with one engine out of action, went missing without a trace with a crew of 306 sometime after 4 March 1918, after departing the island of Barbados.
Although there is no strong evidence for any single theory, many independent theories exist, some blaming storms, some capsizing, and some suggesting that wartime enemy activity was to blame for the loss. In addition, two of "Cyclops"s sister ships, "Proteus" and "Nereus", were subsequently lost in the North Atlantic during World War II. Both ships were transporting heavy loads of metallic ore similar to that which was loaded on "Cyclops" during her fatal voyage. In all three cases structural failure due to overloading with a much denser cargo than designed is considered the most likely cause of sinking. "Carroll A. Deering". "Carroll A. Deering", a five-masted schooner built in 1919, was found hard aground and abandoned at Diamond Shoals, near Cape Hatteras, North Carolina, on 31 January 1921. An FBI investigation into the "Deering" scrutinized, then ruled out, multiple theories as to why and how the ship was abandoned, including piracy, domestic Communist sabotage and the involvement of rum-runners. Flight 19. Flight 19 was a training flight of five TBM Avenger torpedo bombers that disappeared on 5 December 1945, while over the Atlantic. The squadron's flight plan was scheduled to take them due east from Fort Lauderdale, then north, and then back over a final leg to complete the exercise. The flight never returned to base. The disappearance was attributed by Navy investigators to navigational error leading to the aircraft running out of fuel. One of the search and rescue aircraft deployed to look for them, a PBM Mariner with a 13-man crew, also disappeared. A tanker off the coast of Florida reported seeing an explosion and observing a widespread oil slick while fruitlessly searching for survivors. The weather was becoming stormy by the end of the incident. According to contemporaneous sources, the Mariner had a history of explosions due to vapor leaks when heavily loaded with fuel, as it might have been for a potentially long search-and-rescue operation. "Star Tiger" and "Star Ariel". G-AHNP "Star Tiger" disappeared on 30 January 1948, on a flight from the Azores to Bermuda; G-AGRE "Star Ariel" disappeared on 17 January 1949, on a flight from Bermuda to Kingston, Jamaica. Both were Avro Tudor IV passenger aircraft operated by British South American Airways. Both planes were operating at the very limits of their range, and the slightest error or fault in the equipment could keep them from reaching the small island. Douglas DC-3. On 28 December 1948, a Douglas DC-3 aircraft, number NC16002, disappeared while on a flight from San Juan, Puerto Rico, to Miami. No trace of the aircraft, or the 32 people on board, was ever found. A Civil Aeronautics Board investigation found there was insufficient information available on which to determine the probable cause of the disappearance. "Connemara IV". A pleasure yacht was found adrift in the Atlantic south of Bermuda on 26 September 1955; it is usually stated in the stories (Berlitz, Winer) that the crew vanished while the yacht survived being at sea during three hurricanes. The 1955 Atlantic hurricane season shows Hurricane Ione passing nearby between 14 and 18 September, with Bermuda being affected by winds of almost gale force. In his second book on the Bermuda Triangle, Winer quoted from a letter he had received from Mr J. E. Challenor of Barbados. KC-135 Stratotankers. On 28 August 1963, a pair of US Air Force KC-135 Stratotanker aircraft collided and crashed into the Atlantic west of Bermuda.
Some writers say that while the two aircraft did collide, there were two distinct crash sites, separated by a considerable stretch of water. However, Kusche's research showed that the unclassified version of the Air Force investigation report revealed that the debris field defining the second "crash site" was examined by a search and rescue ship and found to be a mass of seaweed and driftwood tangled in an old buoy. References. Bibliography. The incidents cited above, apart from the official documentation, come from the following works; some incidents mentioned as having taken place within the Triangle are found "only" in these sources. Further reading. Newspaper articles. ProQuest has newspaper source material for many incidents, archived in Portable Document Format (PDF). The newspapers include "The New York Times", "The Washington Post", and "The Atlanta Constitution". To access this website, registration is required, usually through a library connected to a college or university. Website links. The following websites have either online material that supports the popular version of the Bermuda Triangle, or documents published from official sources as part of hearings or inquiries, such as those conducted by the United States Navy or United States Coast Guard. Copies of some inquiries are not online and may have to be ordered; for example, the losses of Flight 19 or USS "Cyclops" can be ordered directly from the United States Naval Historical Center. Books. Most of the works listed here are largely out of print. Copies may be obtained at your local library, or purchased used at bookstores or through eBay or Amazon.com. These books are often the "only" source material for some of the incidents that have taken place within the Triangle.
4856
1300142409
https://en.wikipedia.org/wiki?curid=4856
Borough
A borough is an administrative division in various English-speaking countries. In principle, the term "borough" designates a self-governing walled town, although in practice, official use of the term varies widely. History. In the Middle Ages, boroughs were settlements in England that were granted some self-government; burghs were the Scottish equivalent. In medieval England, boroughs were also entitled to elect members of parliament. The use of the word "borough" probably derives from the burghal system of Alfred the Great. Alfred set up a system of defensive strong points (burhs); in order to maintain these particular settlements, he granted them a degree of autonomy. After the Norman Conquest, when certain towns were granted self-governance, the concept of the burh/borough seems to have been reused to mean a self-governing settlement. The concept of the borough has been used repeatedly (and often differently) throughout the world. Often, a borough is a single town with its own local government. However, in some cities it is a subdivision of the city (for example, New York City, London, and Montreal). In such cases, the borough will normally have either limited powers delegated to it by the city's local government, or no powers at all. In other places, such as the U.S. state of Alaska, "borough" designates a whole region; Alaska's largest borough, the North Slope Borough, is comparable in area to the entire United Kingdom, although its population is less than that of Swanage, a town on England's south coast with around 9,600 inhabitants. In Australia, a "borough" was once a self-governing small town, but this designation has all but vanished, except for the only remaining borough in the country, the Borough of Queenscliffe. Boroughs as administrative units are to be found in Ireland and the United Kingdom, more specifically in England and Northern Ireland. Boroughs also exist in the Canadian province of Quebec (and formerly existed in Ontario), in some states of the United States and in Israel; they formerly existed in New Zealand, and only one remains in Australia. Etymology. The word "borough" derives from the Old English "burh" or "burg", meaning a fortified settlement; the word appears as modern English "bury" and "-brough", Scots "burgh", "borg" in the Scandinavian languages, "burcht" in Dutch and "Burg" in German. A number of other European languages have cognate words that were borrowed from the Germanic languages during the Middle Ages, including "bourg" in French, "burg" in Catalan (in Catalonia there is a town named "Burg"), "borgo" in Italian, and "burgo" in Portuguese, Galician and Castilian (hence the Castilian place-name "Burgos"), as well as cognates in Irish, Welsh (where the word means "wall, rampart"), Polish and Slovenian. The 'burg' element, which means "castle" or "fortress", is often confused with 'berg', meaning "hill" or "mountain" (cf. iceberg, inselberg). Hence the 'berg' element in Bergen or Heidelberg relates to a hill, rather than a fort. In some cases, the 'berg' element in place names has converged towards burg/borough; for instance Farnborough, from "fernaberga" (fern-hill). A quite distinct word "borough", a variant of "borgh" or "borow", was sometimes used historically in south-east England (the county of Kent, and parts of Surrey and Sussex), to mean what was elsewhere called a "tithing", a subdivision of a manor or civil parish. The head of such a unit might be known as a "headborough". This usage is not to be confused with the more widespread meaning of "borough" as a privileged town. Definitions. Australia.
In Australia, the term "borough" is an occasionally used term for a local government area. Currently there is only one borough in Australia, the Borough of Queenscliffe in Victoria, although there have been more in the past. However, in some cases it can be integrated into the council's name instead of used as an official title, such as the Kingborough Council in Tasmania. Canada. In Quebec, the term borough is generally used as the English translation of , referring to an administrative division of a municipality, or a district. Eight municipalities are divided into boroughs: See List of boroughs in Quebec. In Ontario, it was previously used to denote suburban municipalities in Metropolitan Toronto, including Scarborough, York, North York and Etobicoke prior to their conversions to cities. The Borough of East York was the last Toronto municipality to hold this status, relinquishing it upon becoming part of the City of Toronto government on January 1, 1998. Colombia. The Colombian municipalities are subdivided into boroughs (English translation of the Spanish term ) with a local executive and an administrative board for local government. These boroughs are divided into neighborhoods. Ireland. There are four borough districts designated by the Local Government Reform Act 2014: Clonmel, Drogheda, Sligo, and Wexford. A local boundary review reporting in 2018 proposed granting borough status to any district containing a census town with a population over 30,000; this would have included the towns of Dundalk, Bray, and Navan. This would have required an amendment to the 2014 Act, promised for 2019 by minister John Paul Phelan. Historically, there were 117 parliamentary boroughs in the Irish House of Commons, of which 80 were disfranchised by the Acts of Union 1800. All but 11 municipal boroughs were abolished under the Municipal Corporations (Ireland) Act 1840. Under the Local Government (Ireland) Act 1898, six of these became county boroughs: Dublin, Belfast, Cork, Derry, Limerick and Waterford. From 1921, Belfast and Derry were part of Northern Ireland and stayed within the United Kingdom on the establishment of the Irish Free State in 1922. Galway was a borough from 1937 until upgraded to a county borough in 1985. The county boroughs in the Republic of Ireland were redesignated as "cities" under the Local Government Act 2001. Dún Laoghaire was a borough from 1930 until merged into Dún Laoghaire–Rathdown county in 1994. There were five borough councils in place at the time of the Local Government Reform Act 2014 which abolished all second-tier local government units of borough and town councils. Each local government authority outside of Dublin, Cork City and Galway City was divided into areas termed municipal districts. In four of the areas which had previously been contained borough councils, as listed above, these were instead termed Borough Districts. Kilkenny had previously had a borough council, but its district was to be called the Municipal District of Kilkenny City, in recognition of its historic city status. Israel. Under Israeli law, inherited from British Mandate municipal law, the possibility of creating a municipal borough exists. However, no borough was actually created under law until 2005–2006, when Neve Monosson and Maccabim-Re'ut, both communal settlements (Heb: yishuv kehilati) founded in 1953 and 1984, respectively, were declared to be autonomous municipal boroughs (Heb: vaad rova ironi), within their mergers with the towns of Yehud and Modi'in. 
Similar structures have been created under different types of legal status over the years in Israel, notably Kiryat Haim in Haifa, Jaffa in Tel Aviv-Yafo and Ramot and Gilo in Jerusalem. However, Neve Monosson is the first example of a full municipal borough actually declared under law by the Minister of the Interior, under a model subsequently adopted in Maccabim-Re'ut as well. Mexico. In Mexico, "borough" has been used as the English translation of "delegación" (delegation), referring to the 16 administrative areas of Mexico City, which are now called "alcaldías". New Zealand. New Zealand formerly used the term borough to designate self-governing towns of more than 1,000 people, although 19th-century census records show many boroughs with populations as low as 200. A borough of more than 20,000 people could become a city by proclamation. Boroughs and cities were collectively known as municipalities, and were enclaves separate from their surrounding counties. Boroughs proliferated in the suburban areas of the larger cities: by the 1980s there were 19 boroughs and three cities in the area that is now the City of Auckland. In the 1980s, some boroughs and cities began to be merged with their surrounding counties to form districts with a mixed urban and rural population. A nationwide reform of local government in 1989 completed the process. Counties and boroughs were abolished and all boundaries were redrawn. Under the new system, most territorial authorities cover both urban and rural land. The more populated councils are classified as cities, and the more rural councils are classified as districts. Only Kawerau District, an enclave within Whakatāne District, continues the tradition of a small town council that does not include surrounding rural area. Trinidad and Tobago. In Trinidad and Tobago, a borough is a unit of local government; there are five boroughs in the Republic of Trinidad and Tobago. United Kingdom. England and Wales. Ancient and municipal boroughs. During the medieval period many towns were granted self-governance by the Crown, at which point they became referred to as boroughs. The formal status of borough came to be conferred by Royal Charter. These boroughs were generally governed by a self-selecting corporation (i.e., when a member died or resigned, his replacement would be by co-option). Sometimes boroughs were governed by bailiffs. Debates on the Reform Bill (eventually the Reform Act 1832) lamented the diversity of polity of such town corporations, and a Royal Commission was set up to investigate this. This resulted in a regularisation of municipal government by the Municipal Corporations Act 1835. 178 of the ancient boroughs were re-formed as "municipal boroughs", with all municipal corporations to be elected according to a standard franchise based on property ownership. The unreformed boroughs either lapsed in borough status or were reformed (or abolished) later. Several new municipal boroughs were formed in the new industrial cities after the act was passed, under its provisions. As part of a large-scale reform of local government in England and Wales in 1974, municipal boroughs were finally abolished (having become increasingly irrelevant). However, the civic traditions of many were continued by the grant of a charter to their successor district councils. For the smallest boroughs, a town council was formed for the equivalent area, while charter trustees were formed for a few others.
A successor body is allowed to use the regalia of the old corporation, and to appoint ceremonial office holders such as sword and mace bearers as provided in the original charters. The council, or trustees, may apply for an Order in Council or Royal Licence to use the coat of arms. Parliamentary boroughs. From 1265, two burgesses from each borough were summoned to the Parliament of England, alongside two knights from each county. Thus parliamentary constituencies were derived from the ancient boroughs. Representation in the House of Commons was decided by the House itself, which resulted in boroughs being established in some small settlements for the purposes of parliamentary representation, despite their possessing no actual corporation. After the 1832 Reform Act, which disenfranchised many of the rotten boroughs (boroughs that had declined in importance, had only a small population, and had only a handful of eligible voters), parliamentary constituencies began to diverge from the ancient boroughs. Many ancient boroughs remained municipal boroughs even after being disenfranchised as parliamentary constituencies by the Reform Act. County boroughs. The Local Government Act 1888 established a new sort of borough – the county borough. These were designed to be 'counties-to-themselves': administrative divisions to sit alongside the new administrative counties. They allowed urban areas to be administered separately from the more rural areas. They therefore often contained pre-existing municipal boroughs, which thereafter became part of the second tier of local government, below the administrative counties and county boroughs. The county boroughs were, like the municipal boroughs, abolished in 1974, being reabsorbed into their parent counties for administrative purposes. Metropolitan boroughs. In 1899, as part of a reform of local government in the County of London, the various parishes in London were reorganised as new entities, the 'metropolitan boroughs'. These were reorganised further when Greater London was formed out of Middlesex, parts of Surrey, Kent, Essex, Hertfordshire and the County of London in 1965. These council areas are now referred to as "London boroughs" rather than "metropolitan boroughs". When the new metropolitan counties (Greater Manchester, Merseyside, South Yorkshire, Tyne and Wear, West Midlands, and West Yorkshire) were created in 1974, their sub-divisions also became metropolitan boroughs in many, but not all, cases; in many cases these metropolitan boroughs recapitulated abolished county boroughs (for example, Stockport). The metropolitan boroughs possessed slightly more autonomy from the metropolitan county councils than the shire county districts did from their county councils. With the abolition of the metropolitan county councils in 1986, these metropolitan boroughs became independent, and continue to be so at present. Other current uses. Elsewhere in England a number of districts and unitary authority areas are called "borough". Until 1974, this was a status that denoted towns with a certain type of local government (a municipal corporation, or a self-governing body). Since 1974, it has been a purely ceremonial style granted by royal charter to districts which may consist of a single town or may include a number of towns or rural areas. Borough status entitles the council chairman to bear the title of mayor. Districts may apply to the British Crown for the grant of borough status upon advice of the Privy Council of the United Kingdom. Northern Ireland.
In Northern Ireland, local government was reorganised in 1973. Under the legislation that created the 26 districts of Northern Ireland, a district council whose area included an existing municipal borough could resolve to adopt the charter of the old municipality and thus continue to enjoy borough status. Districts that do not contain a former borough can apply for a charter in a similar manner to English districts. United States. In the United States, a borough is a unit of local government or other administrative division below the level of the state. The term is currently used in seven states, which use, or have used, the word with differing meanings.
4858
27015025
https://en.wikipedia.org/wiki?curid=4858
Bodmin
Bodmin () is a town and civil parish in Cornwall, England, United Kingdom. It is situated south-west of Bodmin Moor. The extent of the civil parish corresponds fairly closely to that of the town, so it is mostly urban in character. It is bordered to the east by Cardinham parish, to the southeast by Lanhydrock parish, to the southwest and west by Lanivet parish, and to the north by Helland parish. Bodmin had a population of 14,736 as of the 2011 Census. It was formerly the county town of Cornwall until the Crown Courts moved to Truro, which is also the administrative centre (before 1835 the county town was Launceston). Bodmin was in the administrative North Cornwall District until local government reorganisation in 2009 abolished the district ("see also" Cornwall Council). The town is part of the North Cornwall parliamentary constituency, which is represented by Ben Maguire MP. Bodmin Town Council is made up of sixteen councillors who are elected to serve a term of four years. Each year, the Council elects one of its number as Mayor to serve as the town's civic leader and to chair council meetings. Situation and origin of the name. The name of the town probably derives from the Cornish "Bod-meneghy", meaning "dwelling of or by the sanctuary of monks". Variant spellings recorded include "Botmenei" in 1100, "Bodmen" in 1253, "Bodman" in 1377 and "Bodmyn" in 1522. The "Bodman" spelling also appears in sources and maps from the 16th and 17th centuries, most notably in the celebrated map of Cornwall produced by John Speed but actually engraved by the Dutch cartographer Jodocus Hondius the Elder (1563–1612) in Amsterdam in 1610 (published in London by Sudbury and Humble in 1626). The hamlets of Cooksland, Little Kirland, Dunmere and Turfdown are in the parish. History. St. Petroc founded a monastery in Bodmin in the 6th century and gave the town its alternative name of "Petrockstow". The monastery was deprived of some of its lands at the Norman Conquest but at the time of Domesday still held eighteen manors, including Bodmin, Padstow and Rialton. Bodmin is one of the oldest towns in Cornwall, and the only large Cornish settlement recorded in the Domesday Book in 1086. In the 15th century the Norman church of St Petroc was largely rebuilt and stands as one of the largest churches in Cornwall (the largest after the cathedral at Truro). Also built at that time was an abbey of canons regular, now mostly ruined. For most of Bodmin's history, the tin industry was a mainstay of the economy. An inscription on a stone built into the wall of a summer house at Lancarffe furnishes proof of a settlement in Bodmin in the early Middle Ages. It is a memorial to one "Duno[.]atus son of Me[.]cagnus" and has been dated from the 6th to 8th centuries. Arthur Langdon (1896) records three Cornish crosses at Bodmin; one was near the Berry Tower, one was outside Bodmin Gaol and another was in a field near Castle Street Hill. There is also Carminow Cross at a road junction southeast of the town. The Black Death killed half of Bodmin's population, some 1,500 people, in the mid-14th century. The local seat of government was the Bodmin Guildhall in Fore Street. Rebellions. Bodmin was the centre of three Cornish uprisings. The first was the Cornish Rebellion of 1497, when a Cornish army, led by Michael An Gof, a blacksmith from St. Keverne, and Thomas Flamank, a lawyer from Bodmin, marched to Blackheath in London, where they were eventually defeated by 10,000 men of the King's army under Baron Daubeny.
Then, in the autumn of 1497, Perkin Warbeck tried to usurp the throne from Henry VII. Warbeck was proclaimed King Richard IV in Bodmin, but Henry had little difficulty crushing the uprising. In 1549, Cornishmen, allied with other rebels in neighbouring Devon, rose once again in rebellion when the staunchly Protestant Edward VI tried to impose a new Prayer Book. The lower classes of Cornwall and Devon were still strongly attached to the Roman Catholic religion, and again a Cornish army was formed in Bodmin which marched across the border into Devon to lay siege to Exeter. This became known as the Prayer Book Rebellion. Proposals to translate the Prayer Book into Cornish were suppressed, and in total 4,000 people were killed in the rebellion. Bodmin Borough Police. The Borough of Bodmin was one of the 178 municipal boroughs which, under the auspices of the Municipal Corporations Act 1835, were mandated to create an electable council and a Police Watch Committee responsible for overseeing a police force in the town. The new system directly replaced the parish constables that had policed the borough since time immemorial and brought paid, uniformed and accountable law enforcement for the first time. Bodmin Borough Police was the municipal police force for the Borough of Bodmin from 1836 to 1866. The creation of the Cornwall Constabulary in 1857 put pressure on smaller municipal police forces to merge with the county. The two-man force of Bodmin came under threat almost immediately, but it took until 1866 for the Mayor of Bodmin and the Chairman of the Police Watch Committee to agree on the terms of amalgamation. After a public enquiry, the force was disbanded in January 1866 and policing of the borough was transferred to the county thereafter. "Bodmin Town". The song "Bodmin Town" was collected from the Cornishman William Nichols at Whitchurch, Devon, in 1891 by Sabine Baring-Gould, who published a version in his "A Garland of Country Song" (1924). Churches. Parish church of St Petroc. The existing church building is dated 1469–72 and was, until the building of Truro Cathedral, the largest church in Cornwall. The tower, which remains from the original Norman church and stands on the north side of the church (the upper part is 15th-century), retained its spire until 1699. The building underwent two Victorian restorations and another in 1930. It is now listed Grade I. There are a number of interesting monuments, most notably the black Delabole slate memorial to Richard Durant, his wives and twenty children, carved in low relief, and that of Prior Vivian, which was formerly in the Priory Church (Thomas Vivian's effigy lying on a chest, all in black Catacleuse stone). There is also a twelfth-century ivory casket which is thought to have once contained relics of St Petroc. The font, of a type common in Cornwall, is of the 12th century: large and finely carved in elvan. Other churches. The Chapel of St Thomas Becket is a ruin of a 14th-century building in Bodmin churchyard. The holy well of St Guron is a small stone building at the churchyard gate. The Berry Tower is all that remains of the former church of the Holy Rood, and there are even fewer remains of the substantial Franciscan friary established ca. 1240: a gateway in Fore Street and two pillars elsewhere in the town. The Roman Catholic Abbey of St Mary and St Petroc, formerly belonging to the Canons Regular of the Lateran, was built in 1965 next to the already existing seminary.
The Roman Catholic parish of Bodmin includes a large area of North Cornwall, and there are churches also at Wadebridge, Padstow and Tintagel (St Paul's Church, Tintagel). In 1881 the Roman Catholic mass was celebrated in Bodmin for the first time since 1539. A church was planned in the 1930s but delayed by the Second World War; the Church of St Mary and St Petroc was eventually consecrated in 1965. There are also five other churches in Bodmin, including a Methodist church. Sites of interest. Bodmin Jail. Bodmin Jail, operational for over 150 years but now a semi-ruin, was built in the late 18th century and was the first British prison to hold prisoners in separate cells (though often up to ten at a time) rather than communally. Over fifty prisoners condemned at the Bodmin Assize Court were hanged at the prison. It was also used for temporarily holding prisoners sentenced to transportation, awaiting transfer to the prison hulks lying in the highest navigable reaches of the River Fowey. During 1918–19, in the First World War, the prison held some material from Britain's Public Record Office, including the Domesday Book, but not the Crown Jewels as is commonly claimed: in the Second World War these were stored in Windsor Castle. Institutions. Other buildings of interest include the former Shire Hall, now a tourist information centre, and Victoria Barracks, formerly the depot of the now defunct Duke of Cornwall's Light Infantry and now the site of the regimental museum. The museum covers the history of the regiment from 1702 and includes a military library. The original barracks house the regimental museum, which was founded in 1925. There is a fine collection of small arms and machine guns, plus maps, uniforms and paintings on display. The Honey Street drill hall was the mobilisation point for reservists being deployed to serve on the Western Front. Bodmin County Lunatic Asylum, later known as St Lawrence's Hospital, was designed by John Foulston. The humorist William Robert Hicks was domestic superintendent there in the mid-19th century. Walker Lines, named after Lieutenant-General Harold Walker, was a Second World War camp built as an extension to the DCLI barracks. It was used to house men evacuated from Dunkirk and later to house troops for the D-Day landings. In the 1950s it was the site of the Joint Services School for Linguists (JSSL). The site is now an industrial estate but is still known as 'Walker Lines'. Bosvenna House, an Edwardian manor house, was formerly the Bosvenna Hotel and the home of the Royal British Legion Club, but has since become a private residence. There is a sizable single-storey Masonic hall in St Nicholas Street, which is home to no fewer than eight Masonic bodies. Other sites. Bodmin Beacon Local Nature Reserve is the hill overlooking the town. The reserve consists of public land, with a distinctive landmark at its summit: a tall granite monument to Sir Walter Raleigh Gilbert, built in 1857 by the townspeople of Bodmin to honour the soldier's life and work in India. In 1966, the "Finn VC Estate" was named in honour of Victoria Cross winner James Henry Finn, who once lived in the town. An ornate granite drinking bowl which serves the needs of thirsty dogs at the entrance to Bodmin's Priory car park was donated by Prince Chula Chakrabongse of Thailand, who lived at Tredethy. Education. There are no independent schools in the area. Primary schools.
Beacon ACE Academy opened as a primary school for pupils aged 3 to 11 in September 2017, following the merger of Beacon Infant and Nursery School and Robartes Junior School. Beacon ACE Academy is part of the Kernow Learning Multi Academy Trust and is rated Good by Ofsted. The school offers places for 420 pupils, as well as 30 places within its nursery and 10 places within its Area Resource Base for pupils with special educational needs. St Petroc's voluntary aided Church of England Primary School, Athelstan Park, Bodmin, was given this title in September 1990 after the amalgamation of St. Petroc's Infant School and St. Petroc's Junior School. St. Petroc's is a large school with some 440 pupils between the ages of four and 11. Eight of its fourteen governors are nominated by the Diocese of Truro or the Parochial Church Council of St. Petroc's, Bodmin. It is currently rated as "Requires Improvement" by Ofsted. There are a further two primary schools within Bodmin: Berrycoombe School in the northwest corner of the town, and St. Mary's Catholic Primary School. Bodmin College. Bodmin College is a large state comprehensive school for ages 11–18 on the outskirts of the town. The college is home to the Bodmin College Jazz Orchestra. In 1997, Systems & Control students at Bodmin College constructed Roadblock, a robot which entered and won the first series of "Robot Wars" and was succeeded by "The Beast of Bodmin". The school also has one of the largest sixth forms in the county. Callywith College. Callywith College is a further education college in Bodmin that opened in September 2017. A new-build college on a site close to the Asda supermarket, it will eventually cater for 1,280 students, with 197 staff employed. A total of 660 places were available in its first year. It was created with the assistance of Truro and Penwith College to serve students aged 16–19 from Bodmin, North Cornwall and East Cornwall. It received the go-ahead in February 2016, funded as a Free School. Army School of Education. Aspiring National Service sergeant instructors of the Royal Army Education Corps underwent training at the Army School of Education, situated at the end of the Second World War at Buchanan Castle, Drymen, in Scotland, and later, from 1948, at the Walker Lines, Bodmin, until it moved to Wilton Park, Beaconsfield. Transport. Bodmin Parkway railway station – once known as Bodmin Road – is a principal calling point on the Cornish Main Line, south-east of the town centre. Buses to central Bodmin, Wadebridge, Padstow, Rock, Polzeath, Port Isaac and Camelford depart from outside the station entrance. It is connected to Bodmin town by a branch line that is home to the local steam railway, the Bodmin and Wenford Railway. Bodmin is just off the A30, which provides a connection to the M5 motorway at Exeter to the northeast. Bus and coach services connect Bodmin with some other districts of Cornwall and Devon. Sport and leisure. Bodmin has a non-league football club, Bodmin Town, playing in the South West Peninsula League, a level 10 league in the English football league system. Their home ground is at Priory Park. Bodmin Rugby Club play rugby union at Clifden Parc and compete in the Tribute Cornwall/Devon league, a level 8 league in the English rugby union system. The Royal Cornwall Golf Club, founded in 1889 and granted "Royal" status in 1891, was located on Bodmin Moor; the club disbanded in the 1950s. There is an active running club, Bodmin RoadRunners.
Bodmin was a stage finish in the 2021 Tour of Britain cycle race (Stage 1, 5 September). Cornish wrestling. Bodmin has been a great centre for Cornish wrestling over the centuries. The Bodmin Wrestling Association was instrumental in the setting up of the Cornish Wrestling Association in 1923. At the base of the monument on the Beacon are the remains of the wrestling ring, which many believe was a plen-an-gwary (a Cornish open-air playing place). More recently, Cornish wrestling tournaments have been held as part of the revival of Bodmin Riding. Cornish wrestling tournaments and matches were also held at various other places in Bodmin over the years. William George Fish, known as "Billy the Fish", from Bodmin, was the featherweight champion in 1927 and 1928 and the lightweight champion in 1933 and 1934. Deprivation and crime. Some areas of the town have high levels of deprivation, and the proportion of children in poverty is higher than the average for Cornwall. The town is in the most deprived 20% on the Index of Multiple Deprivation, and a higher than average proportion of people living in the area have no qualifications. Bodmin has problems with drug dealing: it is part of the county lines drug trafficking network, and cuckooing is an issue locally. Media. The "Cornish Guardian" is a weekly newspaper published every Wednesday in seven separate editions, including the Bodmin edition. In October 2020, the "Bodmin Voice", sister paper to the "Newquay Voice", was launched. It is published every Wednesday and focuses primarily on Bodmin. Bodmin is the home of NCB Radio, an internet radio station which aims to bring a dedicated station to North Cornwall. The town is also served by the county-wide radio stations BBC Radio Cornwall, Heart West and Greatest Hits Radio South West. Local TV coverage is provided by BBC South West and ITV West Country. Television signals are received from the Caradon Hill transmitter and local relay transmitters. Town twinning. Bodmin is twinned with Bederkesa in Germany; Grass Valley in California, United States; and Le Relecq-Kerhuon (Ar Releg-Kerhuon in Breton) in France. Official heraldry. W. H. Pascoe's 1979 "A Cornish Armory" gives the arms of the priory and the monastery and the seal of the borough. Official events. On Halgavor Moor (Goats' Moor) near Bodmin there was once an annual carnival in July, which on one occasion was attended by King Charles II. Halgavor extends into the parish of Lanhydrock. The Cornish Games were once held on Halgavor Moor; the chief feature of these games was Cornish wrestling, of which Richard Carew gave an account in his "Survey of Cornwall" (1602). Bodmin Riding, a horseback procession through the town, is a traditional annual ceremony. 'Beating the bounds' and 'hurling'. In 1865–66, when William Robert Hicks was mayor of Bodmin, he revived the custom of beating the bounds of the town. He was, according to the Dictionary of National Biography, a very good man of business. This still takes place more or less every five years and concludes with a game of Cornish hurling. Hurling survives as a traditional part of beating the bounds at Bodmin, commencing at the close of the 'Beat'. The game is organised by the Rotary Club of Bodmin and was last played in 2015. The game is started by the Mayor of Bodmin throwing a silver ball into a body of water known as the "Salting Pool". There are no teams, and the hurl follows a set route.
The aim is to carry the ball from the "Salting Pool" via the old A30, along Callywith Road, then through Castle Street, Church Square and Honey Street to finish at the Turret Clock in Fore Street. The participant carrying the ball when it reaches the Turret Clock receives a £10 reward from the mayor. The beating of the bounds and the accompanying hurl most recently took place on 8 April 2015, organised by the Rotary Club of Bodmin.
4859
38004052
https://en.wikipedia.org/wiki?curid=4859
Bodmin Moor
Bodmin Moor () is a granite moorland in north-eastern Cornwall, England, United Kingdom. The granite dates from around the beginning of the Permian period of geological history. The moor includes Brown Willy, the highest point in Cornwall, and Rough Tor, a slightly lower peak. Many of Cornwall's rivers have their sources here. It has been inhabited since at least the Neolithic era, when early farmers started clearing trees and farming the land. They left their megalithic monuments, hut circles and cairns, and the Bronze Age culture that followed left further cairns, and more stone circles and stone rows. By medieval and modern times, nearly all the forest was gone and livestock rearing predominated. The name Bodmin Moor is relatively recent; an early mention is in the "Royal Cornwall Gazette" of 28 November 1812. The upland area was formerly known as Fowey Moor after the River Fowey, which rises within it. Geology. Bodmin Moor is one of five granite plutons in Cornwall that make up part of the Cornubian batholith. The intrusion dates from the Cisuralian epoch, the earliest part of the Permian period, and outcrops across about 190 square km. Around the pluton's margins, where it intruded into slates, the country rock has been hornfelsed. Numerous peat deposits occur across the moor, whilst large areas are characterised by blockfields of granite boulders; both deposits are of Holocene age (see also Geology of Cornwall). Geography. Dramatic granite tors rise from the rolling moorland: the best known are Brown Willy, the highest point in Cornwall at 420 metres, and Rough Tor at 400 metres. To the south-east, Kilmar Tor and Caradon Hill are the most prominent hills. Considerable areas of the moor are poorly drained and form marshes (in hot summers these can dry out). The rest of the moor is mostly rough pasture or covered with heather and other low vegetation. The moor contains about 500 holdings with around 10,000 beef cows, 55,000 breeding ewes and 1,000 horses and ponies. Most of the moor is a Site of Special Scientific Interest (SSSI), "Bodmin Moor, North", and has been designated an Area of Outstanding Natural Beauty (AONB), as part of the Cornwall AONB. The moor has been identified by BirdLife International as an Important Bird Area (IBA) because it supports about 260 breeding pairs of European stonechats as well as a wintering population of 10,000 Eurasian golden plovers. The moor has also been recognised as a separate natural region and designated as national character area 153 by Natural England. Institutional landowners within the "Bodmin Moor, North" SSSI include the National Trust, the Ministry of Defence, the Forestry Commission and Highways England. Rivers and inland waters. Bodmin Moor is the source of several of Cornwall's rivers, mentioned here anti-clockwise from the south. The River Fowey rises high on the moor and flows through Lostwithiel and into the Fowey estuary. The River Tiddy rises near Pensilva and flows southeast to its confluence with the River Lynher (the Lynher flows generally south-east until it joins the Hamoaze near Plymouth). The River Inny rises near Davidstow and flows southeast to its confluence with the River Tamar. The River Camel rises on Hendraburnick Down and flows to join the sea at Padstow. The River Camel and its tributary the De Lank River are an important habitat for the otter, and both have been proposed as Special Areas of Conservation (SACs). The De Lank River rises near Rough Tor and flows along an irregular course before joining the Camel south of Wenford.
The River Warleggan rises near Temple and flows south to join the Fowey. On the southern slopes of the moor lies Dozmary Pool. It is Cornwall's only natural inland lake and is glacial in origin. In the 20th century three reservoirs were constructed on the moor: Colliford Lake, Siblyback Lake and Crowdy Reservoir, which supply water for a large part of the county's population. Various species of waterfowl are resident around these rivers. Parishes. The parishes on the moor are as follows: History and antiquities. Prehistoric times. Around 10,000 years ago, in the Mesolithic period, hunter-gatherers wandered the area when it was wooded. There are several documented cases of flint scatters being discovered by archaeologists, indicating that these hunter-gatherers practised flint knapping in the region. During the Neolithic era, from about 4,500 to 2,300 BC, people began clearing trees and farming the land. It was also in this era that the production of various megalithic monuments began, predominantly long cairns (three of which have currently been identified, at Louden, Catshole and Bearah) and stone circles (sixteen of which have been identified). The naturally forming tors were probably also viewed in a similar manner to the man-made ceremonial sites. In the following Bronze Age, the creation of monuments increased dramatically, with the production of over 300 further cairns, and more stone circles and stone rows. More than 200 Bronze Age settlements with enclosures and field patterns have been recorded, and many prehistoric stone barrows and circles lie scattered across the moor. In the late 1990s, a team of archaeologists and anthropologists from UCL researched the Bronze Age landscapes of Leskernick over several seasons (Barbara Bender, Sue Hamilton, Christopher Tilley and students). In a programme shown in 2007, Channel 4's "Time Team" investigated a 500-metre cairn and the site of a Bronze Age village on the slopes of Rough Tor. King Arthur's Hall, thought to be a late Neolithic or early Bronze Age ceremonial site, can be found to the east of St Breward on the moor. Medieval and modern times. Where practicable, areas of the moor were used for pasture by herdsmen from the parishes surrounding the moor. Granite boulders were also taken from the moor and used for stone posts and to a certain extent for building (such material is known as moorstone). Granite quarrying only became reasonably productive when gunpowder became available. The moor gave its name (Foweymore) to one of the medieval districts called stannaries, which administered tin mining; the boundaries of these were never defined precisely. Until the establishment of a turnpike road through the moor (the present A30) in the 1770s, the size of the moorland area made travel within Cornwall very difficult. Its Cornish name, Goen Bren, is first recorded in the 12th century. English Heritage monographs "Bodmin Moor: An Archaeological Survey" Volume 1 and Volume 2, covering the post-medieval and modern landscape, are publicly available through the Archaeology Data Service. Jamaica Inn is a traditional inn on the moor. Built as a coaching inn in 1750, it was used as a staging post for changing horses and has a long association with smuggling. In 1988, the water supply at Camelford, on the edge of the moor, was accidentally contaminated with aluminium sulphate. Many people suffered medical problems afterwards, and some deaths have since been linked to the incident. Monuments and ruins.
Roughtor was the site of a medieval chapel of St Michael and is now designated as a memorial to the 43rd Wessex Division of the British Army. In 1844, the body of 18-year-old Charlotte Dymond was discovered on Bodmin Moor. Local labourer Matthew Weeks was accused of the murder, and at noon on 12 August 1844 he was led from Bodmin Gaol and hanged. The murder site now has a monument paid for with public money, and her grave is at Davidstow churchyard. Legends and traditions. Dozmary Pool is identified by some people with the lake in which, according to Arthurian legend, Sir Bedivere threw Excalibur to The Lady of the Lake. Another legend relating to the pool concerns Jan Tregeagle. The Beast of Bodmin, an instance of reported British big cat sightings, has been reported many times but never identified with certainty. Physical "evidence" put forward to support such claims has typically been found to have far more ordinary and less sensational origins. In the case of the Beast of Bodmin, when a skull found in the River Fowey was presented to the Natural History Museum as proof of its existence, it was found to have been cut from a leopard-skin rug. Film. "Cornish Cowboy", a 2014 short documentary film screened at the 2015 Cannes Film Festival, was shot on Bodmin Moor. The film features the work of St Neot horse trainer Dan Wilson. Literature. British/Australian author Brand King sets much of his second novel, "A Cornish Spring", on Bodmin Moor. The novel evokes the ghost of murdered 19th-century farm girl Charlotte Dymond to drive its narrative. Her monument features on the book's cover.
4860
1300849745
https://en.wikipedia.org/wiki?curid=4860
Berkeley, California
Berkeley ( ) is a city on the eastern shore of San Francisco Bay in northern Alameda County, California, United States. It is named after the 18th-century Anglo-Irish bishop and philosopher George Berkeley. It borders the cities of Oakland and Emeryville to the south and the city of Albany and the unincorporated community of Kensington to the north. Its eastern border with Contra Costa County generally follows the ridge of the Berkeley Hills. The 2020 census recorded a population of 124,321. Berkeley is home to the oldest campus in the University of California system, the University of California, Berkeley, and to the Lawrence Berkeley National Laboratory, which is managed and operated by the university. It also has the Graduate Theological Union, one of the largest religious studies institutions in the world. Berkeley is considered one of the most socially progressive cities in the United States. History. Indigenous history. The site of today's City of Berkeley was the territory of the Chochenyo/Huchiun Ohlone people when the first Europeans arrived. Evidence of their existence in the area includes pits in rock formations, which they used to grind acorns, wildflower seeds, grass seeds, and many other foods, including squirrel fat, and a shellmound, now mostly leveled and covered up, along the shoreline of San Francisco Bay at the mouth of Strawberry Creek. Human remains and skeletons from Native American burials have been unearthed in West Berkeley and on campus alongside Strawberry Creek. Other artifacts were discovered in the 1950s in the downtown area during remodeling of a commercial building, near the upper course of the creek. Spanish and Mexican eras. The first people of European descent (most of whom were of mixed race and born in America) arrived with the De Anza Expedition in 1776. The De Anza Expedition led to the establishment of the Spanish Presidio of San Francisco at the entrance to San Francisco Bay (the "Golden Gate"). Luis Peralta was among the soldiers at the Presidio. For his services to the King of Spain, he was granted a vast stretch of land on the east shore of San Francisco Bay (the "contra costa", "opposite shore") for a ranch, including that portion that now comprises the City of Berkeley. Luis Peralta named his holding "Rancho San Antonio." The primary activity of the ranch was raising cattle for meat and hides, but hunting and farming were also pursued. Eventually, Peralta gave portions of the ranch to each of his four sons. What is now Berkeley lies mostly in the portion that went to Peralta's son Domingo, with a little in the portion that went to another son, Vicente. No artifact survives of the Domingo or Vicente ranches, but their names survive in Berkeley street names (Vicente, Domingo, and Peralta). However, legal title to all land in the City of Berkeley remains based on the original Peralta land grant. The Peraltas' Rancho San Antonio continued after Alta California passed from Spanish to Mexican sovereignty after the Mexican War of Independence. However, the advent of U.S. sovereignty after the Mexican–American War, and especially the gold rush, saw the Peraltas' lands quickly encroached on by squatters and diminished by dubious legal proceedings. The lands of the brothers Domingo and Vicente were quickly reduced to reservations close to their respective ranch homes. The rest of the land was surveyed and parceled out to various American claimants ("See" Kellersberger's Map).
Politically, the area that became Berkeley was initially part of a vast Contra Costa County. On March 25, 1853, Alameda County was created from a division of Contra Costa County, as well as from a small portion of Santa Clara County. The area that became Berkeley was then the northern part of the "Oakland Township" subdivision of Alameda County. During this period, "Berkeley" was mostly a mix of open land, farms, and ranches, with a small, though busy, wharf by the bay. Late 19th century. In 1866, Oakland's private College of California looked for a new site. It settled on a location north of Oakland along the foot of the Contra Costa Range (later called the Berkeley Hills) on Strawberry Creek, at an elevation of about above the bay, commanding a view of the Bay Area and the Pacific Ocean through the Golden Gate. According to the "Centennial Record of the University of California", "In 1866, at Founders' Rock, a group of College of California men watched two ships standing out to sea through the Golden Gate. One of them, Frederick Billings, thought of the lines of the Anglo-Irish Anglican Bishop George Berkeley, 'westward the course of empire takes its way,' and suggested that the town and college site be named for the eighteenth-century Anglo-Irish philosopher." The philosopher's name is pronounced "BARK-lee", but the city's name, to accommodate American English, is pronounced "BERK-lee". The College of California's College Homestead Association planned to raise funds for the new campus by selling off adjacent parcels of land. To this end, they laid out a plat and street grid that became the basis of Berkeley's modern street plan. Their plans fell far short of their desires, and they began a collaboration with the State of California that culminated in 1868 with the creation of the public University of California. As construction began on the new site, more residences were constructed in the vicinity of the new campus. At the same time, a settlement of residences, saloons, and various industries grew around the wharf area called Ocean View. A horsecar ran from Temescal in Oakland to the university campus along what is now Telegraph Avenue. The first post office opened in 1872. By the 1870s, the transcontinental railroad reached its terminus in Oakland. In 1876, a branch line of the Central Pacific Railroad, the Berkeley Branch Railroad, was laid from a junction with the mainline called Shellmound (now a part of Emeryville) into what is now downtown Berkeley. That same year, the mainline of the transcontinental railroad into Oakland was re-routed, putting the right-of-way along the bay shore through Ocean View. There was a strong prohibition movement in Berkeley at this time. In 1876, the state enacted the "mile limit law", which forbade the sale or public consumption of alcohol within one mile of the new University of California. Then, in 1899, Berkeley residents voted to make their city an alcohol-free zone. Scientists, scholars and religious leaders spoke vehemently of the dangers of alcohol. On April 1, 1878, the people of Ocean View and the area around the university campus, together with local farmers, were granted incorporation by the State of California as the Town of Berkeley. The first elected trustees of the town were the slate of Denis Kearney's anti-Chinese Workingmen's Party, who were particularly favored in the working-class area of the former Ocean View, now called West Berkeley. During the 1880s, Berkeley had segregated housing and anti-Chinese laws.
The area near the university became known for a time as East Berkeley. Due to the influence of the university, the modern age came quickly to Berkeley. Electric lights and the telephone were in use by 1888. Electric streetcars soon replaced the horsecar. A silent film of one of these early streetcars in Berkeley can be seen at the Library of Congress website. Early 20th century. Berkeley's slow growth ended abruptly with the Great San Francisco earthquake of 1906. The town and other parts of the East Bay escaped serious damage, and thousands of refugees flowed across the Bay. Among them were most of San Francisco's painters and sculptors, who between 1907 and 1911 created one of the largest art colonies west of Chicago. Artist and critic Jennie V. Cannon described the founding of the Berkeley Art Association and the rivalries of competing studios and art clubs. In 1904, the first hospitals in Berkeley were created: the Alta Bates Sanatorium (today Alta Bates Summit Medical Center) for women and children, founded by nurse Alta Bates on Walnut Street, and the Roosevelt Hospital (later Herrick Hospital), founded by LeRoy Francis Herrick, on the corner of Dwight Way and Milvia Street. In 1908, a statewide referendum that proposed moving the California state capital to Berkeley was defeated by a margin of about 33,000 votes. The city had named streets around the proposed capitol grounds for California counties. They bear those names today, a legacy of the failed referendum. On March 4, 1909, following public referendums, the citizens of Berkeley were granted a new charter by the State of California, and the Town of Berkeley became the City of Berkeley. Rapid growth continued up to the crash of 1929. The Great Depression hit Berkeley hard, but not as hard as many other places in the U.S., thanks in part to the university. In 1916, Berkeley implemented single-family zoning in an effort to keep minorities out of white neighborhoods. This has been described as the first implementation of single-family zoning in the United States. By 2021, nearly half of Berkeley's residential neighborhoods were still exclusively zoned for single-family homes. On September 17, 1923, a major fire swept down the hills toward the university campus and the downtown section. Around 640 structures burned before a late-afternoon sea breeze stopped its progress, allowing firefighters to put it out. The next big growth occurred with the advent of World War II, when large numbers of people moved to the Bay Area to work in the many war industries, such as the immense Kaiser Shipyards in nearby Richmond. One resident who moved away, but who played a major role in the outcome of the war, was U.C. professor J. Robert Oppenheimer. During the war, an Army base, Camp Ashby, was temporarily sited in Berkeley. The element berkelium was synthesized using the cyclotron at UC Berkeley and named in 1949 in recognition of the university, thus placing the city's name in the list of elements. 1940–60s. During the 1940s, many African Americans migrated to Berkeley. In 1950, the Census Bureau reported Berkeley's population as 11.7% black and 84.6% white. The postwar years brought moderate growth to the city, as events on the U.C. campus began to build up to the recognizable activism of the sixties. In the 1950s, McCarthyism induced the university to demand a loyalty oath from its professors, many of whom refused to sign the oath on the principle of freedom of thought. In 1960, a U.S.
House committee (HUAC) came to San Francisco to investigate the influence of communists in the Bay Area. Their presence was met by protesters, including many from the university. Meanwhile, a number of U.C. students became active in the civil rights movement. Finally, in 1964, the university provoked a massive student protest by banning distribution of political literature on campus. This protest became the Free Speech Movement. As the Vietnam War rapidly escalated in the ensuing years, so did student activism at the university, particularly that organized by the Vietnam Day Committee. Berkeley is strongly identified with the rapid social changes, civic unrest, and political upheaval that characterized the mid-to-late 1960s. In that period, Berkeley—especially Telegraph Avenue—became a focal point for the hippie movement, which spilled over the Bay from San Francisco. Many hippies were apolitical drop-outs, rather than students, but in the heady atmosphere of Berkeley in 1967–1969 there was considerable overlap between the hippie movement and the radical left. An iconic event in the Berkeley Sixties scene was a conflict over a parcel of university property south of the contiguous campus site that came to be called "People's Park". The battle over the disposition of People's Park resulted in a month-long occupation of Berkeley by the National Guard on orders of then-Governor Ronald Reagan. In the end, the park remained undeveloped, and remains so today. A spin-off, "People's Park Annex", was established at the same time by activist citizens of Berkeley on a strip of land above the Bay Area Rapid Transit ("BART") underground construction along Hearst Avenue northwest of the U.C. campus. The land had also been intended for development, but was turned over to the city by BART and is now Ohlone Park. The era of large public protest in Berkeley waned considerably with the end of the Vietnam War in 1975. While the 1960s were the heyday of liberal activism in Berkeley, it remains one of the most overwhelmingly Democratic cities in the United States. 1970s and 1980s. Housing and zoning changes. After the 1960s, Berkeley banned most new housing construction, in particular apartments. Increasing enrollment at the university had led to the replacement of older buildings by large apartment buildings, especially in older parts of the city near the university and downtown. Increasing enrollment also led the university to want to redevelop certain parts of Berkeley, especially Southside, but more specifically People's Park. Preservationists passed the Neighborhood Protection Ordinance in 1973 by ballot measure and the Landmarks Preservation Ordinance in 1974 by the City Council. Together, these ordinances brought most new construction to a halt. Facing rising housing costs, residents voted to enact rent control and vacancy control in 1980. Though more far-reaching in effect than those of some other California jurisdictions that adopted rent control, these policies were constrained by the Costa–Hawkins Rental Housing Act, a state law that came into effect in 1995 and limited rent control to multi-family units built (or, technically, buildings issued their original certificate of occupancy) before that date. For cities such as Berkeley, where rent control was already in place, the law limited the use of rent control to units built before the local rent-control law was enacted, i.e. 1980. Political movements.
During the 1970s and 1980s, activists increased their power in local government. This era also saw major developments in Berkeley's environmental and food culture. Berkeley's last Republican mayor, Wallace J. S. Johnson, left office in 1971. Alice Waters opened Chez Panisse in 1971. The first curbside recycling program in the U.S. was started by the Ecology Center in 1973. Styrofoam was banned in 1988. As the city leaned more and more Democratic, local politics became divided between "Progressives" and "Moderates". In 1984, the Progressives took the majority for the first time. Nancy Skinner became the first UC Berkeley student elected to City Council. In 1986, in reaction to the 1984 election, a ballot measure switched Berkeley from at-large to district-based elections for city council. In 1983, Berkeley's Domestic Partner Task Force was established, which in 1984 made policy recommendations to the school board, which passed domestic partner legislation. The legislation became a model for similar measures nationwide. 1990s and 2000s. In 1995, California's Costa–Hawkins Act ended vacancy control, allowing rents to increase when a tenant moved out. Despite a slowdown in 2005–2007, median home prices and rents remain dramatically higher than in the rest of the nation, fueled by spillover from the San Francisco housing shortage and population growth. South and West Berkeley underwent gentrification, with some historically Black neighborhoods such as the Adeline Corridor seeing a 50% decline in Black/African American population from 1990 to 2010. In the 1990s, public television's "Frontline" documentary series featured race relations at Berkeley's only public high school, Berkeley High School. With an economy dominated by the University of California and a high-demand housing market, Berkeley was relatively unaffected by the Great Recession. State budget cuts caused the university to increase the number of out-of-state and international students, with international enrollment, mostly from Asia, rising from 2,785 in 2007 to 5,951 in 2016. Since then, more international restaurants have opened downtown and on Telegraph Avenue, including East Asian chains such as Ippudo and Daiso. A wave of downtown apartment construction began in 1998. In 2006, the Berkeley Oak Grove Protest began in opposition to construction of a new sports center annex to Memorial Stadium at the expense of a grove of oak trees on the UC campus. The protest ended in September 2008 after a lengthy court process. In 2007–2008, Berkeley received media attention due to demonstrations against a Marine Corps recruiting office in downtown Berkeley and a series of controversial motions by Berkeley's city council regarding opposition to Marine recruiting. ("See" Berkeley Marine Corps Recruiting Center controversy.) 2010s and 2020s. During the fall of 2010, the Berkeley Student Food Collective opened after many protests on the UC Berkeley campus due to the proposed opening of the fast-food chain Panda Express. Students and community members worked together to open a collectively run grocery store just off the UC Berkeley campus, where the community can buy local, seasonal, humane, and organic foods. The Berkeley Student Food Collective still operates at 2440 Bancroft Way. On September 18, 2012, Berkeley became what may be the first city in the U.S. to officially proclaim a day recognizing bisexuals: September 23, which is known as Celebrate Bisexuality Day.
On September 2, 2014, the city council approved a measure to provide free medical marijuana to low-income patients. The Measure D soda tax was approved by Berkeley voters on November 4, 2014, the first such tax in the United States. Protests. In the fall of 2011, the nationwide Occupy Wall Street movement came to two Berkeley locations: on the campus of the University of California and as an encampment in Civic Center Park. During a Black Lives Matter protest on December 6, 2014, police use of tear gas and batons to clear protesters from Telegraph Avenue led to a riot and five consecutive days and nights of protests, marches, and freeway occupations in Berkeley and Oakland. Afterwards, changes were implemented by the Police Department to avoid escalation of violence and to protect bystanders during protests. During a protest against bigotry and U.S. President Donald Trump in August 2017, self-described anti-fascist protesters attacked Trump supporters in attendance. Police intervened, arresting 14 people. Sometimes called "antifa", these anti-fascist activists were clad in black shirts and other black attire; some carried shields, and others wore masks or bandanas to hide their faces and help them evade capture after street fighting. These protests spanned February to September 2017 (see 2017 Berkeley protests). In 2019, protesters opposing tree-felling took up residence in People's Park and were arrested by police in riot gear. Many activists saw this as the university preparing to develop the park. Homelessness. The city of Berkeley has historically been a central location for homeless communities in the Bay Area. Since the 1930s, the city of Berkeley has fostered a tradition of political activism. The city has been perceived as a hub for liberal thought and action, yet it has passed ordinances to oust homeless individuals from Berkeley on multiple occasions. Despite efforts to remove unhoused individuals from the streets and projects to improve social service provision for this demographic, homelessness has continued to be a significant problem in Berkeley. 1960s. A culture of anti-establishment and sociopolitical activism marked the 1960s. The San Francisco Bay Area became a hotspot for hippie counterculture, and Berkeley became a haven for nonconformists and anarchists from all over the United States. Most public discourse around homelessness in Berkeley at this time was centered on the idea of street-living as an expression of counterculture. During the Free Speech Movement in the fall of 1964, Berkeley became a hub of civil unrest, with demonstrators and UC Berkeley students sympathizing with the statewide protests for free speech and assembly, as well as revolting against university restrictions on student political activities and organizations established by UC President Clark Kerr in 1959. Many non-student youth and adolescents sought alternative lifestyles and opted for voluntary homelessness during this time. In 1969, People's Park was created and eventually became a haven for "small-time drug dealers, street people, and the homeless". Although the City of Berkeley has moved unhoused individuals from its streets, sometimes even relocating them to an unused landfill, People's Park has remained a safe space for them since its inception. The park has become one of the few relatively safe spaces for homeless individuals to congregate in Berkeley and the greater Bay Area. 1970s.
Stereotypes of homeless people as deviant individuals who chose to live vagrant lifestyles continued to color the discourse around street-dwellers in American cities. However, this time period was also characterized by a subtle shift in the perception of unhoused individuals. The public began to realize that homelessness affected not only single men, but also women, children, and entire families. This recognition set the stage for the City of Berkeley's attitude towards homelessness in the next decade. 1980s. Organizations such as Building Opportunities for Self Sufficiency (BOSS) were established in 1971 in response to the needs of individuals with mental illness being released to the streets by state hospital closures. 1990s. In the 1990s, the City of Berkeley faced a substantial increase in the need for emergency housing shelters and saw a rise in the average amount of time individuals spent without stable housing. As housing became a more widespread problem, the general public, Berkeley City Council, and the University of California became increasingly anti-homeless in their opinions. In 1994, Berkeley City Council considered the implementation of a set of anti-homeless laws that the "San Francisco Chronicle" described as being "among the strictest in the country". These laws prohibited sitting, sleeping and begging in public spaces, and outlawed panhandling from people in a variety of contexts, such as sitting on public benches, buying a newspaper from a rack, or waiting in line for a movie. In February 1995, the American Civil Liberties Union (ACLU) sued the city for infringing free speech rights through its proposed anti-panhandling law. The following month, the "Street Spirit", a monthly newspaper written for and by people experiencing homelessness, published its first of hundreds of issues covering homelessness in the Bay Area and across the nation. In May of that same year, a federal judge ruled that the anti-panhandling law did violate the First Amendment, but left the anti-sitting and sleeping laws untouched. Following the implementation of these anti-sitting and sleeping ordinances in 1998, Berkeley increased its policing of homeless adults and youth, particularly in the shopping district surrounding Telegraph Avenue. The mayor at that time, Shirley Dean, proposed a plan to increase both social support services for homeless youth and enforcement of anti-encampment laws. Unhoused youth countered this plan with a request for the establishment of the city's first youth shelter, more trash cans, and more frequent cleaning of public bathrooms. 21st century. The City of Berkeley's 2017 annual homeless report and point-in-time count (PIT) estimate that on a given night, 972 people are homeless. Sixty-eight percent (664 people) of these individuals are also unsheltered, living in places not considered suitable for human habitation, such as cars or streets. Long-term homelessness in Berkeley is double the national average, with 27% of the city's homeless population facing chronic homelessness. Chronic homelessness has been on the rise since 2015, and has been largely a consequence of the constrained local housing market. In 2015, rent in Alameda County increased by 25%, while the average household income only grew by 5%. The City of Berkeley's 2017 report also estimated the number of unaccompanied youth in Berkeley at 189 individuals, 19% of the total homeless population in the city.
Homeless youth display greater risk of mental health issues, behavioral problems, and substance abuse than any other homeless age group. Furthermore, homeless youth identifying as LGBTQ+ are exposed to greater rates of physical and sexual abuse, and a higher risk of sexually transmitted diseases, predominantly HIV. The City of Berkeley has seen a consistent rise in the number of chronically homeless individuals over the past 30 years, and has implemented a number of different projects to reduce the number of people living on the streets. In 2008, the City focused its efforts on addressing chronic homelessness. This led to a 48% decline in the number of chronically homeless individuals reported in the 2009 Berkeley PIT. However, the number of "hidden homeless" individuals (those coping with housing insecurity by staying at a friend or relative's residence) increased significantly, likely in response to rising housing costs and costs of living. In 2012, the City considered a measure that would have banned sitting in commercial areas throughout Berkeley. The measure was met with strong public opposition and did not pass. However, the City still saw a strong need to implement rules addressing encampments and public use of space, as well as to assess the resources needed to assist the unhoused population. In response to these needs, the City of Berkeley established the Homeless Task Force, headed by then-Councilmember Jesse Arreguín. Since its formation, the Task Force has proposed a number of different recommendations, from expanding the City Homeless Outreach and Mobile Crisis Teams to building a short-term transitional shelter for unhoused individuals. Geography. According to the United States Census Bureau, the city's area includes of land and (40.83%) water, most of it part of San Francisco Bay. Berkeley borders the cities of Albany, Oakland, and Emeryville and Contra Costa County, including unincorporated Kensington, as well as San Francisco Bay. Berkeley lies within telephone area code 510 (until September 2, 1991, Berkeley was part of the 415 telephone code that now covers only San Francisco and Marin counties), and the postal ZIP codes are 94701 through 94710, 94712, and 94720 for the University of California campus. Geology. Most of Berkeley lies on a rolling sedimentary plain that rises gently from sea level to the base of the Berkeley Hills. East of the Hayward Fault along the base of the hills, elevation increases more rapidly. The highest peak along the ridge line above Berkeley is Grizzly Peak, at an elevation of . A number of small creeks run from the hills to the Bay through Berkeley: Cerrito, Codornices, Schoolhouse, and Strawberry Creeks are the principal streams. Most of these are largely culverted once they reach the plain west of the hills. The Berkeley Hills are part of the Pacific Coast Ranges, and run in a northwest–southeast alignment. Exposed in the Berkeley Hills are cherts and shales of the Claremont Formation (equivalent to the Monterey Formation), conglomerate and sandstone of the Orinda Formation and lava flows of the Moraga Volcanics. Of similar age to the Moraga Volcanics (extinct), within the Northbrae neighborhood of Berkeley, are outcroppings of erosion-resistant rhyolite. These rhyolite formations can be seen in several city parks and in the yards of a number of private residences. Indian Rock Park in the northeastern part of Berkeley near the Arlington/Marin Circle features a large example. Earthquakes.
Berkeley is traversed by the Hayward Fault Zone, a major branch of the San Andreas Fault to the west. No large earthquake has occurred on the Hayward Fault near Berkeley in historic times (except possibly in 1836), but seismologists warn that the geologic record shows large temblors several times in the deeper past. The current assessment is that a Bay Area earthquake of magnitude 6.7 or greater within the next 30 years is likely, with the Hayward Fault having the highest likelihood among faults in the Bay Area of being the epicenter. Moreover, like much of the Bay Area, Berkeley has many areas at some risk of soil liquefaction, with the flat areas closer to the shore at low to high susceptibility. The 1868 Hayward earthquake did occur on the southern segment of the Hayward Fault in the vicinity of today's city of Hayward. This quake destroyed the county seat of Alameda County, then located in San Leandro; the seat was subsequently moved to Oakland. It was strongly felt in San Francisco, causing major damage. It was regarded as the "Great San Francisco earthquake" prior to 1906. It produced a furrow in the ground along the fault line in Berkeley, across the grounds of the new State Asylum for the Deaf, Dumb and Blind then under construction, which was noted by one early University of California professor. Although no significant damage was reported to most of the few Berkeley buildings of the time, the 1868 quake did destroy the vulnerable adobe home of Domingo Peralta in north Berkeley. Today, evidence of the Hayward Fault's "creeping" is visible at various locations in Berkeley. Cracked roadways, sharp jogs in streams, and springs mark the fault's path. However, since it cuts across the base of the hills, the creep is often concealed by or confused with slide activity. Some of the slide activity itself, however, results from movement on the Hayward Fault. A notorious segment of the Hayward Fault runs lengthwise down the middle of Memorial Stadium at the mouth of Strawberry Canyon on the University of California campus. Photos and measurements show the movement of the fault through the stadium. Climate. Berkeley has a warm-summer Mediterranean climate ("Csb" in the Köppen climate classification), with warm, dry summers and cool, wet winters. Berkeley's location directly opposite the Golden Gate ensures that typical eastward fog flow blankets the city more often than its neighbors. The summers are cooler than in a typical Mediterranean climate thanks to upwelling ocean currents along the California coast. These help produce cool and foggy nights and mornings. Winter is punctuated with rainstorms of varying ferocity and duration, but also produces stretches of bright sunny days and clear cold nights. It does not normally snow, though occasionally the hilltops get a dusting. Spring and fall are transitional and intermediate, with some rainfall and variable temperature. Summer typically brings night and morning low clouds or fog, followed by sunny, warm days. The warmest and driest months are typically June through September, with the highest temperatures occurring in September. Mid-summer (July–August) is often a bit cooler due to the sea breezes and fog common then. In a year, there are an average of 2.9 days with highs of or higher, and an average of 0.8 days with lows of or lower. The highest recorded temperature was on June 15, 2000, and July 16, 1993, and the lowest recorded temperature was on December 22, 1990. February is normally the wettest month, averaging of precipitation.
Average annual precipitation is , falling on an average of 63.7 days each year. The most rainfall in one month was in February 1998. The most rainfall in 24 hours was on January 4, 1982. As in most of California, the heaviest rainfall years are usually associated with warm water El Niño episodes in the Pacific (e.g., 1982–83; 1997–98), which bring in drenching "Pineapple Express" storms. In contrast, dry years are often associated with cold Pacific La Niña episodes. Light snow has fallen on rare occasions. Snow has generally fallen every few years on the higher peaks of the Berkeley Hills. In the late spring and early fall, strong offshore winds of sinking air typically develop, bringing heat and dryness to the area. In the spring, this is not usually a problem as vegetation is still moist from winter rains, but extreme dryness prevails by the fall, creating a danger of wildfires. In September 1923, a major fire swept through the neighborhoods north of the university campus, stopping just short of downtown. On October 20, 1991, gusty, hot winds fanned a conflagration along the Berkeley–Oakland border, killing 25 people and injuring 150, as well as destroying 2,449 single-family dwellings and 437 apartment and condominium units. Demographics. 2020 census and estimates. The 2020 United States census reported that Berkeley had a population of 124,321. The population density was 11,874 people per square mile of land area (4,584/km2). The racial and ethnic makeup (where Latinos are excluded from the racial counts and treated as a separate category) of Berkeley was 62,450 (50.2%) White, 9,495 (7.6%) Black or African American, 24,701 (19.9%) Asian, 253 (0.2%) Pacific Islander, 226 (0.2%) Native American, 1,109 (0.9%) from other races, and 9,069 (7.2%) multiracial (two or more races). There were 17,018 (13.7%) of Hispanic or Latino ancestry, of any race. According to the 2022 American Community Survey estimate, the median income for a household was $104,716 and the median income for a family was $177,068. Males had a median income of $102,565 versus $82,772 for females. The per capita income for the city was $63,310. About 4.3% of families and 17.7% of the population were below the poverty line, including 6% of those under age 18 and 8.1% of those age 65 or over. Earlier census data. From the 2010 United States census, the racial makeup of Berkeley was 66,996 (59.5%) White, 11,241 (10.0%) Black or African American, 479 (0.4%) Native American, 21,690 (19.3%) Asian (8.4% Chinese, 2.4% Indian, 2.1% Korean, 1.6% Japanese, 1.5% Filipino, 1.0% Vietnamese), 186 (0.2%) Pacific Islander, 4,994 (4.4%) from other races, and 6,994 (6.2%) from two or more races. There were 12,209 people (10.8%) of Hispanic or Latino ancestry, of any race. Of the city's population, 6.8% were of Mexican ancestry. The Census reported that 99,731 people (88.6% of the population) lived in households, 12,430 (11.0%) lived in non-institutionalized group quarters, and 419 (0.4%) were institutionalized. There were 46,029 households, out of which 8,467 (18.4%) had children under the age of 18 living in them, 13,569 (29.5%) were opposite-sex married couples living together, 3,855 (8.4%) had a female householder with no husband present, and 1,368 (3.0%) had a male householder with no wife present. There were 2,931 (6.4%) unmarried opposite-sex partnerships, and 961 (2.1%) same-sex married couples or partnerships.
16,904 households (36.7%) were made up of individuals, and 4,578 (9.9%) had someone living alone who was 65 years of age or older. The average household size was 2.17. There were 18,792 families (40.8% of all households); the average family size was 2.81. There were 49,454 housing units at an average density of , of which 46,029 were occupied; 18,846 (40.9%) of these were owner-occupied, and 27,183 (59.1%) were occupied by renters. The homeowner vacancy rate was 1.0%; the rental vacancy rate was 4.5%. 45,096 people (40.1% of the population) lived in owner-occupied housing units and 54,635 people (48.5%) lived in rental housing units. In the city, 13,872 people (12.3%) were under the age of 18, 30,295 (26.9%) were aged 18 to 24, 30,231 (26.9%) were aged 25 to 44, 25,006 (22.2%) were aged 45 to 64, and 13,176 (11.7%) were 65 years of age or older. The median age was 31.0 years. For every 100 females, there were 95.6 males. For every 100 females age 18 and over, there were 94.2 males. According to the 2011 American Community Survey 5-Year estimate, the median income for a household in the city was $60,908, and the median income for a family was $102,976. Males had a median income of $67,476 versus $57,319 for females. The per capita income for the city was $38,896. About 7.2% of families and 18.3% of the population were below the poverty line, including 13.2% of those under age 18 and 9.2% of those age 65 or over. Berkeley has a higher-than-average crime rate, particularly property crime, though the crime rate has fallen significantly since 2000. Transportation. Berkeley is served by Amtrak (Capitol Corridor), AC Transit, BART (Ashby, Downtown Berkeley and North Berkeley stations) and bus shuttles operated by major employers including UC Berkeley and Lawrence Berkeley National Laboratory. The Eastshore Freeway (Interstate 80 and Interstate 580) runs along the bay shoreline. Each day there is an influx of thousands of cars into the city from commuting UC faculty, staff and students, making parking for more than a few hours an expensive proposition. Berkeley has one of the highest rates of bicycle and pedestrian commuting in the nation. Berkeley is the safest city of its size in California for pedestrians and cyclists when injuries are measured per pedestrian and per cyclist rather than per capita. Berkeley has modified its original grid roadway structure through use of diverters and barriers, moving most traffic out of neighborhoods and onto arterial streets (visitors often find this confusing, because the diverters are not shown on all maps). Berkeley maintains a separate grid of arterial streets for bicycles, called Bicycle Boulevards, with bike lanes and lower amounts of car traffic than the major streets they often parallel. Attempts to improve the biking infrastructure in Berkeley have been met with controversy. In 2023, the Berkeley city council fired the city's top transportation official over a plan to remove dozens of parking spots on a street to build a protected bike lane. Berkeley hosts the car-sharing network Zipcar. Rather than owning (and parking) their own cars, members share a group of cars parked nearby. Web- and telephone-based reservation systems keep track of hours and charges. Several "pods" (points of departure where cars are kept) exist throughout the city, in several downtown locations, at the Ashby and North Berkeley BART stations, and at various other locations in Berkeley (and other cities in the region). Using alternative transportation is encouraged.
Berkeley has had recurring problems with parking meter vandalism. In 1999, over 2,400 Berkeley meters were jammed, smashed, or sawed apart. Starting in 2005 and continuing into 2006, Berkeley began to phase out mechanical meters in favor of more centralized electronic meters. Transportation history. The first commuter service to San Francisco was provided by the Central Pacific's Berkeley Branch Railroad, a standard-gauge steam railroad, which terminated in downtown Berkeley, and connected in Emeryville (at a locale then known as "Shellmound") with trains to the Oakland ferry pier as well as with the Central Pacific main line starting in 1876. The Berkeley Branch line was extended from Shattuck and University to Vine Street ("Berryman's Station") in 1878. Starting in 1882, Berkeley trains ran directly to the Oakland Pier. In the 1880s, Southern Pacific assumed operation of the Berkeley Branch under a lease from its own paper affiliate, the Northern Railway. In 1911, Southern Pacific electrified this line and the several others it constructed in Berkeley, creating its East Bay Electric Lines division. The huge and heavy cars specially built for these lines were called the "Red Trains" or the "Big Red Cars". The Shattuck line was extended and connected with two other Berkeley lines (the Ninth Street Line and the California Street line) at Solano and Colusa (the "Colusa Wye"). At this time, the Northbrae Tunnel and Rose Street Undercrossing were constructed, both of which still exist. (The Rose Street Undercrossing is not accessible to the public, being situated between what are now two backyards.) The fourth Berkeley line was the Ellsworth St. line to the university campus. The last Red Trains ran in July 1941. The first electric rail service in Berkeley was provided by several small streetcar companies starting in 1891. Most of these were eventually bought up by the Key System of Francis "Borax" Smith, who added lines and improved equipment. The Key System's streetcars were operated by its East Bay Street Railways division. Principal lines in Berkeley ran on Euclid, The Arlington, Telegraph, Shattuck, San Pablo, and Grove (today's Martin Luther King Jr. Way). The last streetcars ran in 1948, replaced by buses. The first electric commuter interurban-type trains to San Francisco from Berkeley were put in operation by the Key System in 1903, several years before the Southern Pacific electrified its steam commuter lines. Like the SP, Key trains ran to a pier serviced by the Key's own fleet of ferryboats, which also docked at the Ferry Building in San Francisco. After the Bay Bridge was built, the Key trains ran to the Transbay Terminal in San Francisco, sharing tracks on the lower deck of the Bay Bridge with the SP's red trains and the Sacramento Northern Railroad. It was at this time that the Key trains acquired their letter designations, which were later preserved by Key's public successor, AC Transit. Today's F bus is the successor of the F train; likewise the E, G and H buses succeeded the corresponding trains. Before the Bridge, these lines were simply the Shattuck Avenue Line, the Claremont Line, the Westbrae Line, and the Sacramento Street Line, respectively. After the Southern Pacific abandoned transbay service in 1941, the Key System acquired the rights to use its tracks and catenary on Shattuck north of Dwight Way and through the Northbrae Tunnel to The Alameda for the F-train. The SP tracks along Monterey Avenue as far as Colusa had been acquired by the Key System in 1933 for the H-train, but were abandoned in 1941.
The Key System trains stopped running in April 1958. On December 15, 1962, the Northbrae Tunnel was opened to auto traffic. Economy. Top employers. According to the city's 2024 Annual Comprehensive Financial Report, the top employers in the city are: Businesses. Berkeley is the location of a number of nationally prominent businesses, many of which have been pioneers in their areas of operation. Notable businesses include Chez Panisse, birthplace of California cuisine, Peet's Coffee's original store, the Claremont Resort, punk rock haven 924 Gilman, Saul Zaentz's Fantasy Studios, and Caffe Strada. Notable former businesses include pioneer bookseller Cody's Books, The Nature Company, The North Face, Clif Bar energy foods, the Berkeley Co-op, and Caffe Mediterraneum. Berkeley has relatively few chain stores for a city of its size, due to policies and zoning that promote small businesses and impose limits on the size of certain types of stores. Places. Neighborhoods. Berkeley has a number of distinct neighborhoods. Surrounding the University of California campus are the most densely populated parts of the city. West of the campus is Downtown Berkeley, the city's traditional commercial core; home of the civic center, the city's only public high school, the busiest BART station in Berkeley, and a major transfer point for AC Transit buses. South of the campus is Southside, mainly a student ghetto, where much of the university's student housing is located. The busiest stretch of Telegraph Avenue is in this neighborhood. North of the campus is the quieter Northside neighborhood, the location of the Graduate Theological Union. Farther from the university campus, the influence of the university quickly becomes less visible. Most of Berkeley's neighborhoods are primarily made up of detached houses, often with separate in-law units in the rear, although larger apartment buildings are also common in many neighborhoods. Commercial activities are concentrated along the major avenues and at important intersections, and frequently define the neighborhood within which they reside. In the southeastern corner of the city is the Claremont District, home to the Claremont Hotel. Also in the southeast is the Elmwood District, known for its commercial area on College Avenue. West of Elmwood is South Berkeley, known for its weekend flea market at the Ashby Station. West of (and including) San Pablo Avenue, itself a major commercial and transport corridor, is West Berkeley, the historic commercial center of the city. This neighborhood includes the former unincorporated town of Ocean View. West Berkeley contains the remnants of Berkeley's industrial area, much of which has been replaced by retail and office uses, as well as residential live/work loft space, paralleling the decline of manufacturing in the United States. This area abuts the shoreline of the San Francisco Bay and is home to the Berkeley Marina. Also nearby is Berkeley's Aquatic Park, featuring an artificial linear lagoon of San Francisco Bay. North of downtown is North Berkeley, whose main commercial area is nicknamed the "Gourmet Ghetto" because of the concentration of well-known restaurants and other food-related businesses. West of North Berkeley (roughly west of Sacramento and north of Cedar) is Westbrae, a small neighborhood centered on a small commercial area on Gilman Street and through which part of the Ohlone Greenway runs.
Further north of North Berkeley are Northbrae, a master-planned subdivision from the early 20th century, and Thousand Oaks. Above these last three neighborhoods, on the western slopes of the Berkeley Hills, are the neighborhoods of Cragmont and La Loma Park, notable for their dramatic views, winding streets, and numerous public stairways and paths. Parks and recreation. The city has many parks and promotes greenery and the environment. The city has planted trees for years and is a leader in the nationwide effort to re-tree urban areas. Tilden Regional Park lies east of the city, occupying the upper extent of Wildcat Canyon between the Berkeley Hills and the San Pablo Ridge. The city is also heavily involved in creek restoration and wetlands restoration, including a planned "daylighting" of Strawberry Creek along Center Street. The Berkeley Marina and East Shore State Park flank its shoreline at San Francisco Bay, and organizations like the Urban Creeks Council and Friends of the Five Creeks, the former of which is headquartered in Berkeley, support the town's riparian areas and coastline as well. César Chávez Park, near the Berkeley Marina, was built at the former site of the city dump. Landmarks and historic districts. In Berkeley, 165 buildings are designated as local landmarks or local structures of merit. Of these, 49 are listed in the National Register of Historic Places, as are several historic districts. Arts and culture. Berkeley is home to the Chilean-American community's La Peña Cultural Center, the largest cultural center for this community in the United States. The Freight and Salvage is the oldest established full-time folk and traditional music venue west of the Mississippi River. Additionally, Berkeley is home to the off-Broadway theater Berkeley Repertory Theater, commonly known as "Berkeley Rep". The Berkeley Repertory Theater comprises two stages and a school, and has received a Tony Award for Outstanding Regional Theatre. The historic Berkeley Art Museum and Pacific Film Archive (BAMPFA) is operated by UC Berkeley, and was moved to downtown Berkeley in January 2016. It offers many exhibitions and screenings of historic films, as well as outreach programs within the community. Education. Colleges and universities. University of California, Berkeley's main campus is in the city limits. The Graduate Theological Union, a consortium of eight independent theological schools, is located a block north of the University of California Berkeley's main campus. The Graduate Theological Union has the largest number of students and faculty of any religious studies doctoral program in the United States. In addition to these theological schools, Zaytuna College, a newly established Muslim liberal-arts college, has taken "Holy Hill" as its new home. The Institute of Buddhist Studies has been located in Berkeley since 1966. Wright Institute, a psychology graduate school, is located in Berkeley. Berkeley City College is a community college in the Peralta Community College District. Primary and secondary schools. The Berkeley Unified School District operates the city's public schools. The first public school in Berkeley was the Ocean View School, now the site of the Berkeley Adult School at Virginia Street and San Pablo Avenue. In the 1960s, Berkeley was one of the earliest US cities to voluntarily desegregate its schools, using a system of busing that is still in use.
The district has eleven elementary schools and one public high school, Berkeley High School (BHS). Established in 1880, BHS currently has over 3,000 students. The Berkeley High campus was designated a historic district by the National Register of Historic Places on January 7, 2008. Saint Mary's College High School, a Catholic school, also has its street address in Berkeley, although most of the grounds and buildings are actually in neighboring Albany. Berkeley has 11 public elementary schools and three middle schools. The East Bay campus of the German International School of Silicon Valley (GISSV) formerly occupied the Hillside Campus in Berkeley, where it opened in 2012. In December 2016, the GISSV closed the building due to unmet seismic retrofit needs. Public libraries. Berkeley Public Library serves as the municipal library, while the University of California, Berkeley Libraries serve the university. Government. Berkeley has a council–manager government. The mayor is elected at-large for a four-year term and is the ceremonial head of the city and the chair of the city council. The Berkeley City Council is composed of the mayor and eight council members elected by district, who each serve four-year terms. Districts 2, 3, 5 and 6 hold their elections in years divisible by four, while Districts 1, 4, 7 and 8 hold theirs in even-numbered years not divisible by four. The city council appoints a city manager, who is the chief executive of the city. Additionally, the city voters directly elect an independent city auditor, school board, and rent stabilization board. Most city officials, including council members, have been elected using instant-runoff voting since November 2010. Kriss Worthington, elected in 1996 to represent District 7, became the first openly LGBT man elected to the Berkeley City Council. Lori Droste, elected in 2014 to represent District 8, became the first openly LGBT woman elected to the Berkeley City Council. Terry Taplin, elected in 2020 to represent District 2, became the first openly LGBT person of color elected to the Berkeley City Council. In 1973, Ying Lee became the first Asian American city councilmember in Berkeley. Jenny Wong, elected in 2018, is the first Asian American City Auditor in Berkeley. In 2014, the City of Berkeley passed a redistricting measure on the ballot to create the nation's first student supermajority district in District 7, which in 2018 elected Rigel Robinson, a 22-year-old UC Berkeley graduate and the youngest councilmember in the city's history. While much of UC Berkeley's student housing is located in District 7, large dormitories also exist in Districts 6 and 8. Districts 4 and 7 are majority-student. Nancy Skinner became the first student to serve on the City Council, elected in 1984 as a graduate student. Robinson became the second student to serve on the City Council, completing his master's degree while in office. Adena Ishii became the first woman of color to be elected Mayor of Berkeley in November 2024. The city's Public Health Division is one of four municipally operated public health agencies in California (the other three being Long Beach, Pasadena, and Vernon). Though it is part of the city government, it qualifies for the same state funds as a county public health department.
Berkeley is also part of Alameda County, whose government is defined and authorized under the California Constitution, California law, and the Charter of the County of Alameda. The county government provides countywide services such as elections and voter registration, law enforcement, jails, vital records, property records, tax collection, and social services, although the county's health department does not cover the city. The county government is primarily composed of the elected five-member Board of Supervisors; other elected offices, including the Sheriff/Coroner, the District Attorney, Assessor, Auditor-Controller/County Clerk/Recorder, and Treasurer/Tax Collector; and numerous county departments and entities under the supervision of the County Administrator. In addition to the Berkeley Unified School District (which is coterminous with the city), Berkeley is also part of the Bay Area Rapid Transit District (BART), the Alameda-Contra Costa Transit District (AC Transit), the East Bay Regional Park District, the East Bay Municipal Utility District, and the Peralta Community College District. Politics. Berkeley has been a Democratic stronghold in presidential elections since 1960, becoming one of the most Democratic cities in the country. The last Republican presidential candidate to receive at least one-quarter of the vote in Berkeley was Richard Nixon in 1968. Consistent with Berkeley's reputation as a strongly liberal and progressive city, in the 2016 presidential election Green Party candidate Jill Stein won more votes than Republican candidate Donald Trump. In the 2020 presidential election, Joe Biden received 93.8% of the vote while Donald Trump received 4.0%. According to the California Secretary of State, as of August 30, 2021, Berkeley has 75,390 registered voters. Of those, 56,740 (75.26%) are registered Democrats, 1,910 (2.53%) are registered Republicans, 14,106 (18.71%) have declined to state a political party affiliation, and 2,634 (3.49%) are registered with a third party. Berkeley became the first city in the United States to pass a sanctuary resolution, on November 8, 1971. Media. The city had a daily newspaper, the "Berkeley Gazette", which ceased publication in 1984, while the "Berkeley Barb" published counter-culture news from 1965 to 1980. Current media include "The Daily Californian", the student newspaper of UC Berkeley, the "Berkeley Times", and the local online-only publications "Berkeleyside", the "Berkeley Daily Planet", and "The Berkeley Scanner". Notable people. Notable individuals who were born in or have lived in Berkeley include Kamala Harris, Steve Wozniak, scientists J. Robert Oppenheimer and Ernest Lawrence, actors Ben Affleck and Andy Samberg, Major League Baseball broadcaster Matt Vasgersian, model Rebecca Romijn, Billie Joe Armstrong, lead singer of Green Day, Adam Duritz of Counting Crows, rapper Lil B, authors Ursula K. Le Guin and Michael Chabon, entertainment and real estate mogul Herbie Herbert, EDM producer KSHMR, university presidents Blake R. Van Leer and Darryll Pines, and adult education advocate Gretchen Bersch. Sister cities. Berkeley has 17 sister cities.
4861
2278355
https://en.wikipedia.org/wiki?curid=4861
Bolventor
Bolventor () is a hamlet on Bodmin Moor in Cornwall, England, United Kingdom. It is situated in Altarnun civil parish between Launceston and Bodmin. Toponymy. The hamlet has been said to take its name from the "Bold Venture" that building a farm in this moorland must have appeared to be, but this is probably folk etymology, as "Bol-" is a common prefix in Cornish placenames. It is much more likely that the name derives from the 'Bold Adventure' tin-working area which was in operation near Jamaica Inn during the 1840s–1850s. Jamaica Inn. Bolventor is the location of the famous Jamaica Inn coaching inn. It is bypassed by a dual carriageway section of the A30 trunk road; before the bypass was built the hamlet straddled the A30 road. Daphne du Maurier, a former resident, chose Bolventor as the setting for her novel about Cornish smugglers titled "Jamaica Inn". The inn that inspired the novel has stood beside the main road through the village since 1547. It is now a tourist attraction in its own right and dominates the hamlet. The inn was the subject of a paranormal investigation during a 2004 episode of the reality television programme "Most Haunted". Church. The former Holy Trinity Church that lies to the east of the hamlet closed some years ago. A mile from Bolventor there was a chapel of St Luke (from the 13th to the early 16th century); its font is now at the church of Tideford. Bolventor parish was established in 1846 (before that date the village was in St Neot parish; the new parish was made up of parts of St Neot, Altarnun and Cardinham parishes) but has now been merged with Altarnun. 1945 air disaster. On 14 September 1945 a Royal Air Force Handley Page Halifax Mk VII (PN305) was operating a flight from RAF Tarrant Rushton, Dorset, to Lajes Field, Azores, Portugal. During the flight an electrical failure caused a dinghy inside the wing to inflate and dislodge from its stowage. The dinghy wrapped around the tail assembly and the aircraft went into a nose dive, crashing into the Priddacombe area of Bolventor. All 21 on board, seven crew and fourteen passengers, died in the crash.
4862
43565111
https://en.wikipedia.org/wiki?curid=4862
Bengal
Bengal ( ) is a historical geographical, ethnolinguistic and cultural term referring to a region in the eastern part of the Indian subcontinent at the apex of the Bay of Bengal. The region of Bengal proper is divided between the modern-day sovereign nation of Bangladesh, the Indian states of West Bengal and Tripura, and the Karimganj district of Assam. The ancient Vanga Kingdom is widely regarded as the namesake of the Bengal region. The Bengali calendar dates back to the reign of Shashanka in the 7th century CE. The Pala Empire was founded in Bengal during the 8th century. The Sena dynasty and Deva dynasty ruled between the 11th and 13th centuries. By the 14th century, Bengal had been absorbed by the Muslim conquests in the Indian subcontinent. An independent Bengal Sultanate was formed, becoming the eastern frontier of the Islamic world. During this period, Bengal's rule and influence spread to Assam, Arakan, Tripura, Bihar, and Odisha (formerly Orissa). Bengal Subah later emerged as a prosperous part of the Mughal Empire. The last independent Nawab of Bengal was defeated in 1757 at the Battle of Plassey by the East India Company. The company's Bengal Presidency grew into the largest administrative unit of British India, with Calcutta serving as the capital of both Bengal and India until 1911. As a result of the first partition of Bengal, a short-lived province called Eastern Bengal and Assam existed between 1905 and 1911, with its capital in the former Mughal capital Dhaka. Following the Sylhet referendum and votes by the Bengal Legislative Council and Bengal Legislative Assembly, the region was again divided along religious lines in 1947. Bengali culture, particularly its literature, music, art and cinema, is well known in South Asia and beyond. The region is also notable for its economic and social scientists, who include several Nobel laureates. Once home to the city with the highest per capita income level in British India, the region is today a leader in South Asia in terms of gender parity, the gender pay gap and other indices of human development. Etymology. The name "Bengal" is derived from the ancient kingdom of Vanga (pronounced Bôngô), the earliest records of which date back to the "Mahabharata" epic in the first millennium BCE. The reference to 'Vangalam' is present in an inscription in the Brihadisvara Temple at Thanjavur, which is one of the oldest references to Bengal. The term "Vangaladesa" is used to describe the region in 11th-century South Indian records. The modern term "Bangla" became prominent from the 14th century, which saw the establishment of the Sultanate of Bengal, whose first ruler Shamsuddin Ilyas Shah was known as the "Shah of Bangala". The Arab geographers Ahmad ibn Majid and Sulaiman Al Mahri also mentioned the region as "Bangala" and the 'land of Bang', and the Portuguese referred to it as "Bengala" in the Age of Discovery. History. Antiquity. Neolithic sites have been found in several parts of the region. In the second millennium BCE, rice-cultivating communities dotted the region. By the eleventh century BCE, people in Bengal lived in systematically aligned homes, produced copper objects, and crafted black and red pottery. Remnants of Copper Age settlements are located in the region. At the advent of the Iron Age, people in Bengal adopted iron-based weapons, tools and irrigation equipment. From 600 BCE, the second wave of urbanisation engulfed the northern Indian subcontinent as part of the Northern Black Polished Ware culture.
Ancient archaeological sites and cities in Dihar, Pandu Rajar Dhibi, Mahasthangarh, Chandraketugarh and Wari-Bateshwar emerged. The Ganges, Brahmaputra and Meghna rivers were natural arteries for communication and transportation. Estuaries on the Bay of Bengal allowed for maritime trade with distant lands in Southeast Asia and elsewhere. The ancient geopolitical divisions of Bengal included Varendra, Suhma, Anga, Vanga, Samatata and Harikela. These regions were often independent or under the rule of larger empires. The Mahasthan Brahmi Inscription indicates that Bengal was ruled by the Mauryan Empire in the 3rd century BCE; the inscription was an administrative order instructing relief for a distressed segment of the population. Punch-marked coins found in the region indicate that coins were used as currency during the Iron Age. The namesake of Bengal is the ancient Vanga Kingdom, which was reputed as a naval power with overseas colonies; a prince from Bengal named Vijaya founded the first kingdom in Sri Lanka. The two most prominent pan-Indian empires of this period were the Maurya and Gupta empires. The region was a centre of artistic, political, social, spiritual and scientific thinking, including the invention of chess, Indian numerals, and the concept of zero. The region was known to the ancient Greeks and Romans as Gangaridai. The Greek ambassador Megasthenes chronicled its military strength and dominance of the Ganges delta. The invasion army of Alexander the Great was deterred by the accounts of Gangaridai's power in 325 BCE, including a cavalry of war elephants. Later Roman accounts noted maritime trade routes with Bengal. Roman coins from the 1st century bearing images of Hercules have been found in the region and point to trade links with Roman Egypt through the Red Sea. The Wari-Bateshwar ruins are believed to be the emporium (trading centre) of Sounagoura mentioned by the Roman geographer Claudius Ptolemy. A Roman amphora made in Aelana (present-day Aqaba, Jordan) between the 4th and 7th centuries AD was found in the Purba Medinipur district of West Bengal. The first unified Bengali polity can be traced to the reign of Shashanka, to which the origins of the Bengali calendar can also be traced. Shashanka founded the Gauda Kingdom. After Shashanka's death, Bengal experienced a period of civil war known as Matsyanyayam. The ancient city of Gauda later gave birth to the Pala Empire. The first Pala emperor, Gopala I, was chosen by an assembly of chieftains in Gauda. The Pala kingdom grew into one of the largest empires in the Indian subcontinent. The Pala period saw advances in linguistics, sculpture, painting, and education. The empire achieved its greatest territorial extent under Dharmapala and Devapala. The Palas vied for control of Kannauj with the rival Gurjara-Pratihara and Rashtrakuta dynasties. Pala influence also extended to Tibet and Sumatra due to the travels and preachings of Atisa. The university of Nalanda was patronised by the Palas, who also built the Somapura Mahavihara, the largest monastic institution in the subcontinent. Pala rule eventually disintegrated. The Chandra dynasty ruled southeastern Bengal and Arakan, and the Varman dynasty ruled parts of northeastern Bengal and Assam. The Sena dynasty emerged as the main successor of the Palas by the 11th century. The Senas were a resurgent Hindu dynasty which ruled much of Bengal. The smaller Deva dynasty also ruled parts of the region.
Ancient Chinese visitors like Xuanzang provided elaborate accounts of Bengal's cities and monastic institutions. Muslim trade with Bengal flourished after the fall of the Sasanian Empire and the Arab takeover of Persian trade routes. Much of this trade occurred with southeastern Bengal in areas east of the Meghna River, and Bengal was probably used as a transit route to China by the earliest Muslims. Abbasid coins have been discovered in the archaeological ruins of Paharpur and Mainamati, and a collection of Sasanian, Umayyad and Abbasid coins is preserved in the Bangladesh National Museum. Sultanate period. In 1204, the Ghurid general Muhammad bin Bakhtiyar Khalji began the Islamic conquest of Bengal. The fall of Lakhnauti, the capital of the Sena dynasty, was recounted by historians circa 1243. According to historical accounts, Ghurid cavalry swept across the Gangetic plains towards Bengal and entered the Bengali capital disguised as horse traders. Once inside the royal compound, Bakhtiyar and his horsemen swiftly overpowered the guards of the Sena king, who had just sat down to eat a meal, and the king hastily fled to the forest with his followers. The overthrow of the Sena king has been described as a coup d'état, which "inaugurated an era, lasting over five centuries, during which most of Bengal was dominated by rulers professing the Islamic faith. In itself this was not exceptional, since from about this time until the eighteenth century, Muslim sovereigns ruled over most of the Indian subcontinent. What was exceptional, however, was that among India's interior provinces only in Bengal—a region approximately the size of England and Scotland combined—did a majority of the indigenous population adopt the religion of the ruling class, Islam". Bengal became a province of the Delhi Sultanate. A coin featuring a horseman was issued to celebrate the Muslim conquest of Lakhnauti, with inscriptions in Sanskrit and Arabic. An abortive Islamic invasion of Tibet was also mounted by Bakhtiyar. Bengal was under the formal rule of the Delhi Sultanate for approximately 150 years, though Delhi struggled to consolidate control over the province. Rebel governors often sought to assert autonomy or independence, and Sultan Iltutmish had to re-establish control over Bengal in 1225 after suppressing the rebels. Due to the considerable overland distance, Delhi's authority in Bengal was relatively weak, and it was left to local governors to expand territory and bring new areas under Muslim rule, such as through the Conquest of Sylhet in 1303. In 1338, new rebellions sprang up in Bengal's three main towns, as the governors in Lakhnauti, Satgaon and Sonargaon declared independence from Delhi. This allowed the ruler of Sonargaon, Fakhruddin Mubarak Shah, to annex Chittagong to the Islamic administration. By 1352, the ruler of Satgaon, Shamsuddin Ilyas Shah, had unified the region into an independent state. Ilyas Shah established his capital in Pandua. The new breakaway state emerged as the Bengal Sultanate, which developed into a territorial, mercantile and maritime empire. At the time, the Islamic world stretched from Muslim Spain in the west to Bengal in the east. The initial raids of Ilyas Shah saw the first Muslim army enter Nepal, and his campaigns stretched from Varanasi in the west to Orissa in the south and Assam in the east. The Bengali army continued to fend off attacks from Delhi, and the Bengal-Delhi War ended in 1359 when Delhi recognised the independence of Bengal.
Ilyas Shah's son Sikandar Shah defeated Delhi Sultan Firuz Shah Tughluq during the Siege of Ekdala Fort, and a subsequent peace treaty recognised Bengal's independence; Sikandar Shah was gifted a golden crown by the Sultan of Delhi. The ruler of Arakan sought refuge in Bengal during the reign of Ghiyasuddin Azam Shah, and Jalaluddin Muhammad Shah later helped the Arakanese king regain his throne in exchange for Arakan becoming a tributary state of the Bengal Sultanate; Bengali influence in Arakan persisted for 300 years. Bengal also helped the king of Tripura regain his throne in exchange for tributary status, and the ruler of the Jaunpur Sultanate likewise sought refuge in Bengal. The vassal states of Bengal included Arakan, Tripura, Chandradwip and Pratapgarh. At its peak, the Bengal Sultanate's territory included parts of Arakan, Assam, Bihar, Orissa, and Tripura. The Bengal Sultanate experienced its greatest military success under Alauddin Hussain Shah, who was proclaimed the conqueror of Assam after his forces, led by Shah Ismail Ghazi, overthrew the Khen dynasty and annexed large parts of Assam. In maritime trade, the Bengal Sultanate benefited from Indian Ocean trade networks and emerged as a hub of re-exports. A giraffe was brought by African envoys from Malindi to Bengal's court and was later gifted to Imperial China. Ship-owning merchants acted as envoys of the Sultan while travelling to different regions in Asia and Africa, and many rich Bengali merchants lived in Malacca. Bengali ships transported embassies from Brunei, Aceh and Malacca to China, and Bengal and the Maldives carried on a vast trade in shell currency. The Sultan of Bengal donated funds to build schools in the Hejaz region of Arabia. The five dynastic periods of the Bengal Sultanate spanned from the Ilyas Shahi dynasty, to a period of rule by Bengali converts, to a period of rule by Abyssinian usurpers, to the Hussain Shahi dynasty, and, after an interruption by the Suri dynasty, ended with the Karrani dynasty. The Battle of Raj Mahal and the capture of Daud Khan Karrani marked the end of the Bengal Sultanate during the reign of the Mughal Emperor Akbar. In the late 16th century, a confederation called the Baro-Bhuyan resisted Mughal invasions in eastern Bengal. The Baro-Bhuyan included twelve Muslim and Hindu leaders of the Zamindars of Bengal, led by Isa Khan, a former prime minister of the Bengal Sultanate. By the 17th century, the Mughals were able to fully absorb the region into their empire. Mughal period. Mughal Bengal had the richest elite and was the wealthiest region in the subcontinent. Bengal's trade and wealth impressed the Mughals so much that it was described as the "Paradise of the Nations" by the Mughal emperors. A new provincial capital was built in Dhaka. Members of the imperial family were appointed to positions in Mughal Bengal, including the position of governor ("subedar"), and Dhaka became a centre of palace intrigue and politics. Some of the most prominent governors included the Rajput general Man Singh I, Emperor Shah Jahan's son Prince Shah Shuja, Emperor Aurangzeb's son and later Mughal emperor Azam Shah, and the influential aristocrat Shaista Khan. During the tenure of Shaista Khan, the Portuguese and Arakanese were expelled from the port of Chittagong in 1666. Bengal became the eastern frontier of the Mughal administration, and by the 18th century it was home to a semi-independent aristocracy led by the Nawabs of Bengal.
Bengal's premier Murshid Quli Khan managed to curtail the influence of the governor due to his rivalry with Prince Azam Shah. Khan controlled Bengal's finances since he was in charge of the treasury, and he shifted the provincial capital from Dhaka to Murshidabad. In 1717, the Mughal court in Delhi recognised the hereditary monarchy of the Nawab of Bengal. The ruler was officially titled the "Nawab of Bengal, Bihar and Orissa", as the Nawab ruled over the three regions in the eastern subcontinent. The Nawabs began issuing their own coins but continued to pledge nominal allegiance to the Mughal emperor. The wealth of Bengal was vital for the Mughal court, because Delhi received its biggest share of revenue from the Nawab's court. The Nawabs presided over a period of unprecedented economic growth and prosperity, including an era of growing organisation in textiles, banking, a military-industrial complex, the production of fine-quality handicrafts, and other trades; a process of proto-industrialisation was underway. Under the Nawabs, the streets of Bengali cities were filled with brokers, workers, peons, naibs, wakils, and ordinary traders. The Nawab's state was a major exporter of Bengal muslin, silk, gunpowder and saltpetre. The Nawabs also permitted European trading companies to operate in Bengal, including the British East India Company, the French East India Company, the Danish East India Company, the Austrian East India Company, the Ostend Company, and the Dutch East India Company, though they remained suspicious of the companies' growing influence. Under Mughal rule, Bengal was a centre of the worldwide muslin and silk trades. During the Mughal era, the most important centre of cotton production was Bengal, particularly around its capital city of Dhaka, leading to muslin being called "daka" in distant markets such as Central Asia. Domestically, much of India depended on Bengali products such as rice, silks and cotton textiles. Overseas, Europeans depended on Bengali products such as cotton textiles, silks and opium; Bengal accounted for 40% of Dutch imports from Asia, for example, including more than 50% of textiles and around 80% of silks. From Bengal, saltpetre was shipped to Europe, opium was sold in Indonesia, raw silk was exported to Japan and the Netherlands, cotton and silk textiles were exported to Europe, Indonesia, and Japan, and cotton cloth was exported to the Americas and around the Indian Ocean. Bengal also had a large shipbuilding industry. In terms of shipbuilding tonnage during the 16th–18th centuries, the economic historian Indrajit Ray estimates the annual output of Bengal at 223,250 tons, compared with 23,061 tons produced in nineteen colonies in North America from 1769 to 1771. From the 16th century, European traders traversed the sea routes to Bengal, following the Portuguese conquests of Malacca and Goa. The Portuguese established a settlement in Chittagong with permission from the Bengal Sultanate in 1528 but were later expelled by the Mughals in 1666. In the 18th century, the Mughal court rapidly disintegrated due to Nader Shah's invasion and internal rebellions, allowing European colonial powers to set up trading posts across the territory. The British East India Company eventually emerged as the foremost military power in the region, and defeated the last independent Nawab of Bengal at the Battle of Plassey in 1757. Colonial era (1757–1947).
The British East India Company began influencing and controlling the Nawab of Bengal from 1757 after the Battle of Plassey, signalling the start of British influence in India. British control of Bengal increased between 1757 and 1793, while the Nawab was reduced to a puppet figure and the Presidency of Fort William asserted greater control over the entire province of Bengal and neighbouring territories. Calcutta was named the capital of British territories in India in 1772. The presidency was run by a military-civil administration, including the Bengal Army, and had the world's sixth earliest railway network. Between 1833 and 1854, the Governor of Bengal was concurrently the Governor-General of India. Famines struck Bengal several times during colonial rule, notably the Great Bengal famine of 1770 and the Bengal famine of 1943. Under British rule, Bengal experienced the deindustrialisation of its pre-colonial economy, as Company policies undermined Bengal's textile industry. The capital amassed by the East India Company in Bengal was invested in the emerging Industrial Revolution in Great Britain, in industries such as textile manufacturing. Economic mismanagement, alongside drought and a smallpox epidemic, directly led to the Great Bengal famine of 1770, which is estimated to have caused the deaths of between 1 million and 10 million people. In 1862, the Bengal Legislative Council was set up as the first modern legislature in India. Elected representation was gradually introduced during the early 20th century, including with the Morley-Minto reforms and the system of dyarchy. In 1937, the council became the upper chamber of the Bengali legislature while the Bengal Legislative Assembly was created. Between 1937 and 1947, the chief executive of the government was the Prime Minister of Bengal. The Bengal Presidency was the largest administrative unit in the British Empire; at its height, it covered large parts of present-day India, Pakistan, Bangladesh, Burma, Malaysia, and Singapore. In 1830, the British Straits Settlements on the coast of the Strait of Malacca were made a residency of Bengal. The area included the erstwhile Prince of Wales Island, Province Wellesley, Malacca and Singapore. In 1867, Penang, Singapore and Malacca were separated from Bengal into the Straits Settlements. British Burma became a province of India and later a Crown colony in its own right. Western areas, including the Ceded and Conquered Provinces and the Punjab, were further reorganised, and northeastern areas became Colonial Assam. In 1876, about 200,000 people were killed in Bengal by the Great Backerganj Cyclone of 1876 in the Barisal region, and about 50 million people died in Bengal due to massive plague outbreaks and famines between 1895 and 1920, mostly in western Bengal. The Indian Rebellion of 1857 was initiated on the outskirts of Calcutta and spread to Dhaka, Chittagong, Jalpaiguri, Sylhet and Agartala, in solidarity with revolts in North India. The failure of the rebellion led to the abolition of Company rule in India and the establishment of direct rule over India by the British, commonly referred to as the British Raj. The late 19th and early 20th century Bengal Renaissance had a great impact on the cultural and economic life of Bengal and started a great advance in the literature and science of Bengal.
Between 1905 and 1911, an abortive attempt was made to divide the province of Bengal into two: Bengal proper and the short-lived province of Eastern Bengal and Assam. In 1913, the Bengali poet and polymath Rabindranath Tagore became Asia's first Nobel laureate when he won the Nobel Prize in Literature. Bengal played a major role in the Indian independence movement, in which revolutionary groups were dominant. Armed attempts to overthrow the British Raj began with the rebellion of Titumir and reached a climax when Subhas Chandra Bose led the Indian National Army against the British. Bengal was also central to the rising political awareness of the Muslim population: the All-India Muslim League was established in Dhaka in 1906. The Muslim homeland movement pushed for a sovereign state in eastern India with the Lahore Resolution of 1940. Hindu nationalism was also strong in Bengal, which was home to groups like the Hindu Mahasabha. In spite of a last-ditch effort by the politicians Huseyn Shaheed Suhrawardy and Sarat Chandra Bose to form a United Bengal, when India gained independence in 1947, Bengal was partitioned along religious lines. The western part joined India (and was named West Bengal) while the eastern part joined Pakistan as a province called East Bengal (later renamed East Pakistan, giving rise to Bangladesh in 1971). The circumstances of partition were bloody, with widespread religious riots in Bengal. Partition of Bengal (1947). On 27 April 1947, the last Prime Minister of Bengal, Huseyn Shaheed Suhrawardy, held a press conference in New Delhi where he outlined his vision for an independent Bengal. Suhrawardy said "Let us pause for a moment to consider what Bengal can be if it remains united. It will be a great country, indeed the richest and the most prosperous in India capable of giving to its people a high standard of living, where a great people will be able to rise to the fullest height of their stature, a land that will truly be plentiful. It will be rich in agriculture, rich in industry and commerce and in course of time it will be one of the powerful and progressive states of the world. If Bengal remains united this will be no dream, no fantasy". On 2 June 1947, British Prime Minister Clement Attlee told the US Ambassador to the United Kingdom that there was a "distinct possibility Bengal might decide against partition and against joining either Hindustan or Pakistan". On 3 June 1947, the Mountbatten Plan outlined the partition of British India. On 20 June, the Bengal Legislative Assembly met to decide on the partition of Bengal. At the preliminary joint meeting, it was decided (126 votes to 90) that if the province remained united, it should join the Constituent Assembly of Pakistan. At a separate meeting of legislators from West Bengal, it was decided (58 votes to 21) that the province should be partitioned and West Bengal should join the Constituent Assembly of India. At another meeting of legislators from East Bengal, it was decided (106 votes to 35) that the province should not be partitioned and (107 votes to 34) that East Bengal should join the Constituent Assembly of Pakistan if Bengal was partitioned. On 6 July, the Sylhet district of Assam voted in a referendum to join East Bengal. The English barrister Cyril Radcliffe was instructed to draw the borders of Pakistan and India, and the resulting Radcliffe Line created the boundary between the Dominion of India and the Dominion of Pakistan, which later became the Bangladesh-India border.
The Radcliffe Line awarded two-thirds of Bengal to Pakistan as its eastern wing, although the historic Bengali capitals of Gaur, Pandua, Murshidabad and Calcutta fell on the Indian side, close to the border with Pakistan. Dhaka's status as a capital was also restored. Geography. Most of the Bengal region lies in the Ganges-Brahmaputra delta, but there are highlands in its north, northeast and southeast. The Ganges Delta arises from the confluence of the Ganges, Brahmaputra, and Meghna rivers and their respective tributaries. The total area of Bengal is about 236,000 km2: West Bengal covers 88,752 km2 and Bangladesh roughly 147,570 km2. The flat and fertile Bangladesh Plain dominates the geography of Bangladesh, while the Chittagong Hill Tracts and the Sylhet region are home to most of the country's mountains. Most parts of Bangladesh lie within 12 m (39 ft) of sea level, and it is believed that about 10% of the land would be flooded if the sea level were to rise by 1 m (3.3 ft). Because of this low elevation, much of this region is exceptionally vulnerable to seasonal flooding due to monsoons. The highest point in Bangladesh is in the Mowdok range, at 1,052 m (3,451 ft). A major part of the coastline comprises a marshy jungle, the Sundarbans, the largest mangrove forest in the world and home to diverse flora and fauna, including the royal Bengal tiger. In 1997, this region was declared endangered. West Bengal is on the eastern bottleneck of India, stretching from the Himalayas in the north to the Bay of Bengal in the south. The state has a total area of 88,752 km2 (34,267 sq mi). The Darjeeling Himalayan hill region in the northern extreme of the state belongs to the eastern Himalaya. This region contains Sandakfu (3,636 m or 11,929 ft), the highest peak of the state. The narrow Terai region separates this region from the plains, which in turn transition into the Ganges delta towards the south. The Rarh region intervenes between the Ganges delta in the east and the western plateau and highlands. A small coastal region is on the extreme south, while the Sundarbans mangrove forests form a remarkable geographical landmark at the Ganges delta. At least nine districts in West Bengal and 42 districts in Bangladesh have arsenic levels in groundwater above the World Health Organization maximum permissible limit of 50 μg/L (50 parts per billion), and the untreated water is unfit for human consumption. The water causes arsenicosis, skin cancer and various other complications in the body. Geographic distinctions. North Bengal. North Bengal is a term used for the north-western part of Bangladesh and the northern part of West Bengal. The Bangladeshi part comprises Rajshahi Division and Rangpur Division. Generally, it is the area lying west of the Jamuna River and north of the Padma River, and includes the Barind Tract. Politically, West Bengal's part comprises Jalpaiguri Division and most of Malda division (except Murshidabad district), while Bihar's part includes Kishanganj district. The Darjeeling hills are also part of North Bengal. The people of Jalpaiguri, Alipurduar and Cooch Behar usually identify themselves as North Bengali. North Bengal is divided into the Terai and Dooars regions. It is also noted for its rich cultural heritage, including two UNESCO World Heritage Sites. Aside from the Bengali majority, North Bengal is home to many other communities, including Nepalis, Santhals, Lepchas and Rajbongshis. Northeast Bengal. Northeast Bengal refers to the Sylhet region, which today comprises the Sylhet Division of Bangladesh and the Karimganj district in the Indian state of Assam.
The region is famous for its fertile land, many rivers, extensive tea plantations, rainforests and wetlands. The Brahmaputra and Barak rivers are the geographic markers of the area. The city of Sylhet is its largest urban centre, and the region is known for its unique regional Sylheti language. The region was known in ancient times as Srihatta and later as Nasratshahi. It was ruled by the Kamarupa and Harikela kingdoms as well as the Bengal Sultanate, and later became a district of the Mughal Empire. Alongside the predominant Bengali population reside small Garo, Bishnupriya Manipuri, Khasia and other tribal minorities. The region is the crossroads of Bengal and northeast India. Central Bengal. Central Bengal refers to the Dhaka Division of Bangladesh. It includes the elevated Madhupur tract with a large Sal tree forest. The Padma River cuts through the southern part of the region, separating the greater Faridpur region; in the north lie the greater Mymensingh and Tangail regions. South Bengal. South Bengal covers southwestern Bangladesh and the southern part of the Indian state of West Bengal. The Bangladeshi part includes Khulna Division, Barisal Division and the proposed Faridpur Division, while the West Bengal part includes the Presidency, Burdwan and Medinipur divisions. The Sundarbans, a major biodiversity hotspot, is located in South Bengal; Bangladesh hosts 60% of the forest, with the remainder in India. Southeast Bengal. Southeast Bengal refers to the hilly-coastal Chittagonian-speaking and coastal Bengali-speaking areas of Chittagong Division in southeastern Bangladesh. The region is noted for its thalassocratic and seafaring heritage. The area was dominated by the Bengali Harikela and Samatata kingdoms in antiquity, and was known to Arab traders as "Samandar" in the 9th century. During the medieval period, the region was ruled by the Chandra dynasty, the Sultanate of Bengal, the Kingdom of Tripura, the Kingdom of Mrauk U, the Portuguese Empire and the Mughal Empire, prior to the advent of British rule. The Chittagonian language, a sister language of Bengali, is prevalent in coastal areas of southeast Bengal. Along with its Bengali population, the region is also home to Tibeto-Burman ethnic groups, including the Chakma, Marma, Tanchangya and Bawm peoples. Southeast Bengal is considered a bridge to Southeast Asia, and the northern parts of Arakan are also historically considered to be a part of it. Places of interest. There are four World Heritage Sites in the region: the Sundarbans, the Somapura Mahavihara, the Mosque City of Bagerhat and the Darjeeling Himalayan Railway. Other prominent places include the temple city of Bishnupur in Bankura, the Adina Mosque, the Caravanserai Mosque, numerous zamindar palaces (like Ahsan Manzil and Cooch Behar Palace), the Lalbagh Fort, the Great Caravanserai ruins, the Shaista Khan Caravanserai ruins, the Kolkata Victoria Memorial, the Dhaka Parliament Building, the archaeologically excavated ancient fort cities of Mahasthangarh, Mainamati, Chandraketugarh and Wari-Bateshwar, the Jaldapara National Park, the Lawachara National Park, the Teknaf Game Reserve and the Chittagong Hill Tracts. Cox's Bazar in southeastern Bangladesh is home to the longest natural sea beach in the world, with an unbroken length of 120 km (75 mi); it is also a growing surfing destination. St. Martin's Island, off the coast of Chittagong Division, is home to the sole coral reef in Bengal. Other regions. Bengal was a regional power of the Indian subcontinent.
The administrative jurisdiction of Bengal historically extended beyond the territory of Bengal proper. In the 9th century, the Pala Empire of Bengal ruled large parts of northern India. The Bengal Sultanate controlled Bengal, Assam, Arakan, Bihar and Orissa at different periods in history, and in Mughal Bengal, the Nawab of Bengal had a jurisdiction covering Bengal, Bihar and Orissa. Bengal's administrative jurisdiction reached its greatest extent under the British Empire, when the Bengal Presidency stretched from the Straits of Malacca in the east to the Khyber Pass in the west. In the late 19th and early 20th centuries, administrative reorganisation drastically reduced the territory of Bengal. Several regions bordering Bengal proper continue to have high levels of Bengali influence. The Indian state of Tripura has a Bengali-majority population, and Bengali influence is also prevalent in the Indian regions of Assam, Meghalaya, Bihar and the Andaman and Nicobar Islands, as well as in Myanmar's Rakhine State. Arakan. Arakan (now Rakhine State, Myanmar) has historically been under strong Bengali influence. Since antiquity, Bengal has influenced the culture of Arakan. The ancient Bengali script was used in Arakan, and an Arakanese inscription recorded the reign of the Bengali Chandra dynasty. Paul Wheatley described the "Indianization" of Arakan. According to Pamela Gutman, "Arakan was ruled by kings who adopted Indian titles and traditions to suit their own environment. Indian Brahmins conducted royal ceremonies, Buddhist monks spread their teachings, traders came and went and artists and architects used Indian models for inspiration. In the later period, there was also influence from the Islamic courts of Bengal and Delhi". Arakan emerged as a vassal state of the Bengal Sultanate and later became an independent kingdom. The royal court and culture of the Kingdom of Mrauk U was heavily influenced by Bengal: Bengali Muslims served in the royal court as ministers and military commanders, while Bengali Hindus and Bengali Buddhists served as priests. Some of the most important poets of medieval Bengali literature lived in Arakan, including Alaol and Daulat Qazi. In 1660, Prince Shah Shuja, the governor of Mughal Bengal and a pretender to the Peacock Throne of India, was forced to seek asylum in Arakan. Bengali influence in the Arakanese royal court persisted until the Burmese annexation in the 18th century. The modern-day Rohingya population is a legacy of Bengal's influence on Arakan. The Rohingya genocide resulted in the displacement of over a million people between 2016 and 2017, with many being uprooted from their homes in Rakhine State. Assam. The Indian state of Assam shares many cultural similarities with Bengal. The Assamese language uses the same script as the Bengali language, and the Barak Valley has a Bengali-speaking majority population. During the Partition of India, Assam was also partitioned along with Bengal: the Sylhet region joined East Bengal in Pakistan, with the exception of Karimganj, which joined Indian Assam. Previously, East Bengal and Assam had been part of a single province called Eastern Bengal and Assam between 1905 and 1912 under the British Raj. Assam and Bengal were often part of the same kingdoms, including Kamarupa, Gauda and Kamata. Large parts of Assam were annexed by Alauddin Hussain Shah during the Bengal Sultanate, but Assam was one of the few regions in the subcontinent to successfully resist Mughal expansion and never fell completely under Mughal rule. Andaman and Nicobar Islands.
Bengali is the most spoken language among the population of the Andaman and Nicobar Islands, a strategically important archipelago which is controlled by India as a federal territory. The islands were once used as a British penal colony. During World War II, the islands were seized by the Japanese and controlled by the Provisional Government of Free India; the anti-British leader Subhas Chandra Bose visited and renamed the islands. Between 1949 and 1971, the Indian government resettled many Bengali Hindus in the Andaman and Nicobar Islands. Bihar. In antiquity, Bihar and Bengal were often part of the same kingdoms. The ancient region of Magadha covered both Bihar and Bengal, and was the birthplace or bastion of several pan-Indian empires, including the Mauryan Empire, the Gupta Empire and the Pala Empire. Bengal, Bihar and Orissa together formed a single province under the Mughal Empire, and the Nawab of Bengal was styled the Nawab of Bengal, Bihar and Orissa. Chittagong Hill Tracts. The Chittagong Hill Tracts is the southeastern frontier of Bangladesh. Its indigenous population includes Tibeto-Burman ethnicities, among them the Chakma, Bawm and Mro peoples. The region was historically ruled by tribal chieftains of the Chakma Circle and the Bohmong Circle. In 1713, the Chakma Raja signed a treaty with Mughal Bengal after obtaining permission from Emperor Farrukhsiyar for trade with the plains of Chittagong. Like the kings of Arakan, the chiefs of the Chakma Circle began to fashion themselves using Mughal nomenclature and titles. They initially resisted the Permanent Settlement and the activities of the East India Company. The tribal royal families of the region came under heavy Bengali influence; the Chakma queen Benita Roy was a friend of Rabindranath Tagore. The region was governed by the Chittagong Hill Tracts manual under colonial rule. The manual was significantly amended after the end of British rule, and the region became fully integrated with Bangladesh. Malay Archipelago. Through trade, settlements and the exchange of ideas, parts of Maritime Southeast Asia became linked with Bengal. Language, literature, art, governing systems, religions and philosophies in ancient Sumatra and Java were influenced by Bengal. Hindu-Buddhist kingdoms in Southeast Asia depended on the Bay of Bengal for trade and ideas. Islam in Southeast Asia also spread through the Bay of Bengal, which was a bridge between the Malay Archipelago and the Indo-Islamic states of the Indian subcontinent. A large number of wealthy merchants from Bengal were based in Malacca, and Bengali ships were the largest in the waters of the Malay Archipelago during the 15th century. Between 1830 and 1867, the ports of Singapore and Malacca, the island of Penang, and a portion of the Malay Peninsula were ruled under the jurisdiction of the Bengal Presidency of the British Empire. These areas were known as the Straits Settlements, which were separated from the Bengal Presidency and converted into a Crown colony in 1867. Meghalaya. The Indian state of Meghalaya historically came under the influence of Shah Jalal, a Muslim missionary and conqueror from Sylhet. During British rule, the city of Shillong was the summer capital of Eastern Bengal and Assam (modern Bangladesh and Northeast India), and it boasted the highest per capita income in British India. North India. The ancient Mauryan, Gupta and Pala empires of the Magadha region (Bihar and Bengal) extended into northern India.
The westernmost border of the Bengal Sultanate extended towards Varanasi and Jaunpur. In the 19th century, Punjab and the Ceded and Conquered Provinces formed the western extent of the Bengal Presidency. According to the British historian Rosie Llewellyn-Jones, "The Bengal Presidency, an administrative division introduced by the East India Company, would later include not only the whole of northern India up to the Khyber Pass on the north-west frontier with Afghanistan, but would spread eastwards to Burma and Singapore as well". Odisha. Odisha, previously known as Orissa, has a significant Bengali minority. Historically, the region has faced invasions from Bengal, including an invasion by Shamsuddin Ilyas Shah. Parts of the region were ruled by the Bengal Sultanate and Mughal Bengal, and the Nawab of Bengal was styled the "Nawab of Bengal, Bihar and Orissa" because the Nawab was granted jurisdiction over Orissa by the Mughal Emperor. Tibet. During the Pala period, Tibet received missionaries from Bengal who influenced the emergence of Tibetan Buddhism; one of the most notable was Atisa. During the 13th century, Tibet experienced an Islamic invasion by the forces of Bakhtiyar Khalji, the Muslim conqueror of Bengal. Tripura. The princely state of Tripura was ruled by the Manikya dynasty until the 1949 Tripura Merger Agreement. Tripura was historically a vassal state of Bengal. After assuming the throne with military support from the Bengal Sultanate in 1464, Ratna Manikya I introduced administrative reforms inspired by the government of Bengal. The Tripura kings asked Sultan Barbak Shah to provide manpower for developing the administration of Tripura, and as a result, Bengali Hindu bureaucrats, cultivators and artisans began settling in Tripura. Today, the Indian state of Tripura has a Bengali-majority population, and modern Tripura is a gateway for trade and transport links between Bangladesh and Northeast India. The celebrated singer S. D. Burman, a major figure in Bengali culture, was a member of the Tripura royal family. Flora and fauna. The flat Bengal Plain, which covers most of Bangladesh and West Bengal, is one of the most fertile areas on Earth, with lush vegetation and farmland dominating its landscape. Bengali villages are buried among groves of mango, jackfruit, betel nut and date palm, and rice, jute, mustard and sugarcane plantations are a common sight. Water bodies and wetlands provide a habitat for many aquatic plants in the Ganges-Brahmaputra delta. The northern part of the region features Himalayan foothills ("Dooars") with densely wooded Sal and other tropical evergreen trees. Above an elevation of 1,000 metres (3,300 ft), the forest becomes predominantly subtropical, dominated by temperate-forest trees such as oaks, conifers and rhododendrons. Sal woodland is also found across central Bangladesh, particularly in the Bhawal National Park. The Lawachara National Park is a rainforest in northeastern Bangladesh, and the Chittagong Hill Tracts in southeastern Bangladesh are noted for their high degree of biodiversity. The littoral Sundarbans in the southwestern part of Bengal is the largest mangrove forest in the world and a UNESCO World Heritage Site. The region has over 89 species of mammals, 628 species of birds and numerous species of fish. For Bangladesh, the water lily, the oriental magpie-robin, the hilsa and the mango tree are national symbols. For West Bengal, the white-throated kingfisher, the chatim tree and the night-flowering jasmine are state symbols.
The Bengal tiger is the national animal of Bangladesh and India, while the fishing cat is the state animal of West Bengal. Politics. Today, the region of Bengal proper is divided between the sovereign state of the People's Republic of Bangladesh and the Indian state of West Bengal. The Bengali-speaking Barak Valley forms part of the Indian state of Assam, and the Indian state of Tripura has a Bengali-speaking majority and was formerly the princely state of Hill Tipperah. In the Bay of Bengal, St. Martin's Island is governed by Bangladesh, while the Andaman and Nicobar Islands have a plurality of Bengali speakers and are governed by India's federal government as a union territory. Bangladeshi Republic. The state of Bangladesh is a parliamentary republic based on the Westminster system, with a written constitution and a President elected by parliament for mostly ceremonial purposes. The government is headed by a Prime Minister, who is appointed by the President from among the popularly elected 300 Members of Parliament in the Jatiyo Sangshad, the national parliament. The Prime Minister is traditionally the leader of the single largest party in the Jatiyo Sangshad. While recognising Islam as the country's established religion, the constitution grants freedom of religion to non-Muslims. Between 1975 and 1990, Bangladesh had a presidential system of government. Since the 1990s, it has been administered by non-political technocratic caretaker governments on four occasions, the last being under military-backed emergency rule in 2007 and 2008. The Awami League and the Bangladesh Nationalist Party (BNP) were the two most dominant political parties in Bangladesh until the July Revolution of 2024, which led to the ousting of Sheikh Hasina following mass protests and a nationwide uprising. In the aftermath, a non-partisan interim government was formed under Nobel laureate Muhammad Yunus to restore democratic governance and oversee institutional reforms. Much like the technocratic caretaker governments between 1990 and 2008, this interim administration pledged neutrality and began preparations for constitutional changes and future elections. Bangladesh is a member of the UN, WTO, IMF, the World Bank, ADB, OIC, IDB, SAARC, BIMSTEC and the IMCTC. Bangladesh has made significant strides in human development compared to its neighbours. Indian Bengal. West Bengal is a constituent state of the Republic of India, with local executives and assemblies, features shared with other states in the Indian federal system. The president of India appoints a governor as the ceremonial representative of the union government, and the governor appoints the chief minister on the nomination of the legislative assembly. The chief minister is traditionally the leader of the party or coalition with the most seats in the assembly. President's rule is often imposed in Indian states as a direct intervention of the union government led by the prime minister of India. The Bengali-speaking zone of India carries 48 seats in the Lok Sabha, the lower house of the Indian parliament. Each state has popularly elected members in the Lok Sabha and nominates members to the Indian upper house of parliament, the Rajya Sabha. The state legislative assemblies also play a key role in electing the ceremonial president of India. The former president of India, Pranab Mukherjee, was a native of West Bengal and a leader of the Indian National Congress.
The former leader of the opposition in the Lok Sabha, Adhir Ranjan Chowdhury, is from West Bengal; he was elected from the Baharampur Lok Sabha constituency. The major political forces in the Bengali-speaking zone of India are the Left Front, the Trinamool Congress, the Indian National Congress and the Bharatiya Janata Party. The Bengali-speaking zone of India is considered a stronghold of communism in India. Bengalis have been known not to vote on communal lines, but in recent years this has changed. The West Bengal-based Trinamool Congress is now the third largest party in India in terms of numbers of MPs and MLAs, after the Bharatiya Janata Party and the Indian National Congress; earlier, the Communist Party of India (Marxist) held this position. Cross-border relations. India and Bangladesh are the world's first and eighth most populous countries respectively. Bangladesh-India relations began on a high note in 1971, when India played a major role in the liberation of Bangladesh, with the Indian Bengali populace and media providing overwhelming support to the independence movement in the former East Pakistan. The two countries had a twenty-five-year friendship treaty between 1972 and 1996. However, differences over river sharing, border security and access to trade have long plagued the relationship. In more recent years, a consensus has evolved in both countries on the importance of developing good relations, as well as a strategic partnership in South Asia and beyond. Commercial, cultural and defence co-operation have expanded since 2010, when Prime Ministers Sheikh Hasina and Manmohan Singh pledged to reinvigorate ties. The Bangladesh High Commission in New Delhi operates a Deputy High Commission in Kolkata and a consular office in Agartala, while India has a High Commission in Dhaka with consulates in Chittagong and Rajshahi. Frequent international air, bus and rail services connect major cities in Bangladesh and Indian Bengal, particularly the three largest cities: Dhaka, Kolkata and Chittagong. Undocumented immigration of Bangladeshi workers is a controversial issue championed by right-wing nationalist parties in India but finds little sympathy in West Bengal. India has since fenced the border, a move which has been criticised by Bangladesh. Economy. The Ganges Delta provided the advantages of fertile soil, ample water, and an abundance of fish, wildlife, and fruit. Living standards for Bengal's elite were relatively better than in other parts of the Indian subcontinent. Between 400 and 1200, Bengal had a well-developed economy in terms of land ownership, agriculture, livestock, shipping, trade, commerce, taxation, and banking. The apparent vibrancy of the Bengal economy at the beginning of the 15th century is attributed to the end of tribute payments to the Delhi Sultanate, which ceased after the creation of the Bengal Sultanate and stopped the outflow of wealth. Ma Huan's travelogue recorded a booming shipbuilding industry and significant international trade in Bengal. In 1338, Ibn Battuta noticed that the silver taka was the most popular currency in the region instead of the Islamic dinar, and in 1415, members of Admiral Zheng He's entourage also noticed the dominance of the taka. The currency was the most important symbol of sovereignty for the Sultan of Bengal. The Sultanate of Bengal established an estimated 27 mints in provincial capitals across the kingdom; these provincial capitals, known as Mint Towns, formed an integral part of governance and administration in Bengal.
The taka continued to be issued in Mughal Bengal, which inherited the sultanate's legacy. As Bengal became more prosperous and integrated into the world economy under Mughal rule, the taka replaced shell currency in rural areas and became the standardised legal tender. It was also used in commerce with the Dutch East India Company, the French East India Company, the Danish East India Company and the British East India Company. Under Mughal rule, Bengal was the centre of the worldwide muslin trade, which was patronised by the Mughal imperial court. Muslin from Bengal was worn by aristocratic ladies in courts as far away as Europe, Persia and Central Asia. The treasury of the Nawab of Bengal was the biggest source of revenue for the imperial Mughal court in Delhi. Bengal had a large shipbuilding industry: its shipbuilding output during the 16th and 17th centuries stood at 223,250 tons annually, which was higher than the volume of shipbuilding in the nineteen colonies of North America between 1769 and 1771. Historically, Bengal has been the industrial leader of the subcontinent. Mughal Bengal saw the emergence of a proto-industrial economy backed by textiles and gunpowder. This organised early modern economy flourished until the beginning of British rule in the mid-18th century, when the region underwent radical and revolutionary changes in government, trade, and regulation. The British displaced the indigenous ruling class and transferred much of the region's wealth back to the colonial metropole in Britain. In the 19th century, the British began investing in railways and limited industrialisation. However, the Bengali economy was dominated by trade in raw materials during much of the colonial period, particularly the jute trade. The partition of India changed the economic geography of the region. Calcutta in West Bengal inherited a thriving industrial base from the colonial period, particularly in terms of jute processing, while East Pakistan soon developed its own industrial base, including the world's largest jute mill. In 1972, the newly independent government of Bangladesh nationalised 580 industrial plants. These industries were later privatised in the late 1970s as Bangladesh moved towards a market-oriented economy. Liberal reforms in 1991 paved the way for a major expansion of Bangladesh's private sector industry, including in telecoms, natural gas, textiles, pharmaceuticals, ceramics, steel and shipbuilding. In 2022, Bangladesh was the second largest economy in South Asia after India. The region is one of the largest rice-producing areas in the world, with West Bengal being India's largest rice producer and Bangladesh the world's fourth largest. Three Bengali economists have been Nobel laureates: Amartya Sen and Abhijit Banerjee won the Nobel Memorial Prize in Economics, while Muhammad Yunus won the Nobel Peace Prize. Intra-Bengal trade. Bangladesh and India are each other's largest trading partners in South Asia, with two-way trade valued at an estimated US$16 billion. Most of this trade relationship is centred on some of the world's busiest land ports on the Bangladesh-India border. The Bangladesh Bhutan India Nepal Initiative seeks to boost trade through a Regional Motor Vehicles Agreement. Demographics. The Bengal region is one of the most densely populated areas in the world. With a population of 300 million, Bengalis are the third largest ethnic group in the world after the Han Chinese and Arabs.
According to the provisional results of the 2011 Bangladesh census, the population of Bangladesh was 149,772,364; however, the CIA's "The World Factbook" gives 163,654,860 as its population in a July 2013 estimate. According to the provisional results of the 2011 Indian national census, West Bengal has a population of 91,347,736. The Bengal region thus has at least 241.1 million people, giving a population density of 1,003.9/km2 and making it among the most densely populated areas in the world. Bengali is the main language spoken in Bengal. Many phonological, lexical, and structural differences from the standard variety occur in peripheral varieties of Bengali across the region. Other regional languages closely related to Bengali include Sylheti, Chittagonian, Chakma, Rangpuri/Rajbangshi, Hajong, Rohingya, and Tangchangya. According to the 2011 Indian census, 18% of Bengali speakers are bilingual, of whom half can speak Hindi, and 5% are trilingual. In general, Bengalis are followers of Islam, Hinduism, Christianity and Buddhism, with a significant number being irreligious. In addition, several minority ethnolinguistic groups are native to the region. These include speakers of other Indo-Aryan languages (e.g., Bishnupriya Manipuri, Oraon Sadri, various Bihari languages), Tibeto-Burman languages (e.g., A'Tong, Chak, Koch, Garo, Megam, Meitei (officially called "Manipuri"), Mizo, Mru, Pangkhua, Rakhine/Marma, Kok Borok, Riang, Tippera, Usoi, various Chin languages), Austroasiatic languages (e.g., Khasi, Koda, Mundari, Pnar, Santali, War), and Dravidian languages (e.g., Kurukh, Sauria Paharia). Life expectancy is around 72.49 years in Bangladesh and 70.2 years in West Bengal. In terms of literacy, West Bengal leads with a 77% literacy rate, while in Bangladesh the rate is approximately 72.9%. The level of poverty in West Bengal is 19.98%, while in Bangladesh it stands at 12.9%. West Bengal has one of the lowest total fertility rates in India; its TFR of 1.6 roughly equals that of Canada. Major cities. The Bengal region is home to some of the largest urban areas in the world: Dhaka is the fourth-largest urban area in the world, and Kolkata is the seventeenth-largest. Other major cities in West Bengal. Other important cities of the West Bengal region include Howrah, Kalyani, Basirhat, Kharagpur, Durgapur, Cooch Behar, Malda, Chandannagar, Bardhaman and Darjeeling. Other major cities in Bangladesh. Other important cities in Bangladesh include Sylhet, Khulna, Jessore, Rajshahi, Rangpur and Cox's Bazar. Culture. Language. The Bengali language developed between the 7th and 10th centuries from Apabhraṃśa and Magadhi Prakrit. It is written using the indigenous Bengali alphabet, a descendant of the ancient Brahmi script. Bengali is the fifth most spoken native language in the world. It is an eastern Indo-Aryan language and one of the easternmost branches of the Indo-European language family. It is part of the Bengali-Assamese languages. Bengali has greatly influenced other languages in the region, including Odia, Assamese, Chakma, Nepali and Rohingya. It is the sole state language of Bangladesh and the second most spoken language in India. It is also the seventh most spoken language in the world by total number of speakers. Bengali binds together a culturally diverse region and is an important contributor to regional identity. The 1952 Bengali Language Movement in East Pakistan is commemorated by UNESCO as International Mother Language Day, as part of global efforts to preserve linguistic identity. Currency. 
In both Bangladesh and West Bengal, currency is commonly denominated as taka. The Bangladeshi taka is an official standard-bearer of this tradition, while the Indian rupee is also written as taka in Bengali script on all of its banknotes. The history of the taka dates back centuries: Bengal was home to one of the world's earliest coin currencies in the first millennium BCE. Under the Delhi Sultanate, the taka was introduced by Muhammad bin Tughluq in 1329. Bengal became the stronghold of the taka. The silver currency was the most important symbol of sovereignty of the Sultanate of Bengal. It was traded on the Silk Road and replicated in Nepal and China's Tibetan protectorate. The Pakistani rupee was scripted in Bengali as taka on its banknotes until Bangladesh's creation in 1971. Literature. Bengali literature has a rich heritage, with a history stretching back to the 3rd century BCE, when the main language was Sanskrit written in the Brahmi script. The Bengali language and script evolved from Magadhi Prakrit. Bengal has a long tradition in folk literature, evidenced by the "Chôrjapôdô", "Mangalkavya", "Shreekrishna Kirtana", "Maimansingha Gitika" or "Thakurmar Jhuli". Bengali literature in the medieval age was often either religious (e.g. Chandidas) or adapted from other languages (e.g. Alaol). During the Bengal Renaissance of the nineteenth and twentieth centuries, Bengali literature was modernised through the works of authors such as Michael Madhusudan Dutta, Ishwar Chandra Vidyasagar, Bankim Chandra Chattopadhyay, Rabindranath Tagore, Sarat Chandra Chattopadhyay, Kazi Nazrul Islam, Satyendranath Dutta, Begum Rokeya and Jibanananda Das. In the 20th century, prominent modern Bengali writers included Syed Mujtaba Ali, Jasimuddin, Manik Bandopadhyay, Tarasankar Bandyopadhyay, Bibhutibhushan Bandyopadhyay, Buddhadeb Bose, Sunil Gangopadhyay and Humayun Ahmed. Prominent contemporary Bengali writers in English include Amitav Ghosh, Tahmima Anam, Jhumpa Lahiri and Zia Haider Rahman, among others. Personification. The Bangamata is a female personification of Bengal which was created during the Bengal Renaissance and later adopted by Bengali nationalists. Hindu nationalists adopted a modified Bharat Mata as a national personification of India. Mother Bengal represents not only biological motherhood but also its attributed characteristics: protection, never-ending love, consolation, care, and the beginning and the end of life. In "Amar Sonar Bangla", the national anthem of Bangladesh, Rabindranath Tagore used the word "Maa" (Mother) numerous times to refer to the motherland, i.e. Bengal. Art. The Pala-Sena school of art developed in Bengal between the 8th and 12th centuries and is considered a high point of classical Asian art; it included sculptures and paintings. Islamic Bengal was noted for its production of the finest cotton fabrics and saris, notably the Jamdani, which received warrants from the Mughal court. The Bengal School of painting flourished in Kolkata and Shantiniketan during the British Raj in the early 20th century; its practitioners were among the harbingers of modern painting in India. Zainul Abedin was the pioneer of modern Bangladeshi art, and the country has a thriving and internationally acclaimed contemporary art scene. Architecture. Classical Bengali architecture features terracotta buildings. 
Ancient Bengali kingdoms laid the foundations of the region's architectural heritage through the construction of monasteries and temples (for example, the Somapura Mahavihara). During the sultanate period, a distinct Islamic style of architecture developed in the region; most Islamic buildings were small and highly artistic terracotta mosques with multiple domes and no minarets. Bengal was also home to the largest mosque in South Asia, at Adina. Bengali vernacular architecture is credited with inspiring the popularity of the bungalow. The Bengal region also has a rich heritage of Indo-Saracenic architecture, including numerous zamindar palaces and mansions; the most prominent example of this style is the Victoria Memorial, Kolkata. In the 1950s, Muzharul Islam pioneered the modernist terracotta style of architecture in South Asia. This was followed in the 1960s by the design of the Jatiyo Sangshad Bhaban by the renowned American architect Louis Kahn, which drew on the aesthetic heritage of Bengali architecture and geography. Sciences. The Gupta dynasty, which is believed to have originated in North Bengal, is credited with the invention of chess, the concept of zero, the theory of the Earth orbiting the Sun, the study of solar and lunar eclipses, and a flourishing of Sanskrit literature and drama. The educational reforms during the British Raj produced many distinguished scientists in Bengal. Sir Jagadish Chandra Bose pioneered the investigation of radio and microwave optics, made significant contributions to plant science, and laid the foundations of experimental science in the Indian subcontinent. The IEEE named him one of the fathers of radio science, and he was the first person from the Indian subcontinent to receive a US patent, in 1904. In 1924–25, while researching at the University of Dhaka, Satyendra Nath Bose, well known for his work in quantum mechanics, provided the foundation for Bose–Einstein statistics and the theory of the Bose–Einstein condensate. Meghnad Saha was the first scientist to relate a star's spectrum to its temperature, developing thermal ionization equations (notably the Saha ionization equation) that have been foundational in astrophysics and astrochemistry. Amal Kumar Raychaudhuri was a physicist known for his research in general relativity and cosmology; his most significant contribution is the eponymous Raychaudhuri equation, which demonstrates that singularities arise inevitably in general relativity and is a key ingredient in the proofs of the Penrose–Hawking singularity theorems. In the United States, the Bangladeshi-American engineer Fazlur Rahman Khan emerged as the "father of tubular designs" in skyscraper construction. Ashoke Sen is an Indian theoretical physicist whose main area of work is string theory; he was among the first recipients of the Fundamental Physics Prize "for opening the path to the realisation that all string theories are different limits of the same underlying theory". Music. The Baul tradition is a unique heritage of Bengali folk music. The 19th-century mystic poet Lalon Shah is the most celebrated practitioner of the tradition. Other folk music forms include Gombhira, Bhatiali and Bhawaiya. Hason Raja is a renowned folk poet of the Sylhet region. Folk music in Bengal is often accompanied by the ektara, a one-stringed instrument; other instruments include the dotara, dhol, flute, and tabla. The region also has a rich heritage in North Indian classical music. Cuisine. 
Bengali cuisine is the only traditionally developed multi-course tradition from the Indian subcontinent. Rice and fish are traditional favourite foods, leading to a saying that "fish and rice make a Bengali". Bengal's vast repertoire of fish-based dishes includes Hilsa preparations, a favourite among Bengalis. Bengalis make distinctive sweetmeats from milk products, including "Rôshogolla", "Chômchôm", and several kinds of "Pithe". The old city of Dhaka is noted for its distinct Indo-Islamic cuisine, including biryani, bakarkhani and kebab dishes. Boats. There are 150 types of Bengali country boats plying the 700 rivers of the Bengal delta, the vast floodplain and many oxbow lakes. They vary in design and size. The boats include the dinghy and sampan, among others. Country boats are a central element of Bengali culture and have inspired generations of artists and poets, including the ivory artisans of the Mughal era. The region has a long shipbuilding tradition, dating back many centuries. Wooden boats are made of timber such as "jarul" (Dipterocarpus turbinatus), "sal" (Shorea robusta), "sundari" (Heritiera fomes) and Burma teak (Tectona grandis). Medieval Bengal was a shipbuilding hub for the Mughal and Ottoman navies. The British Royal Navy later utilised Bengali shipyards in the 19th century, including for the Battle of Trafalgar. Attire. Bengali women commonly wear the "shaŗi", often distinctly designed according to local cultural customs. In urban areas, many women and men wear Western-style attire. Among men, European dressing has greater acceptance. Men also wear traditional costumes such as the "panjabi" with "dhoti" or "pyjama", often on religious occasions. The lungi, a kind of long skirt, is widely worn by Bangladeshi men. Festivals. For Bengali Muslims, the major religious festivals are Eid al-Fitr, Eid al-Adha, Mawlid, Muharram, and Shab-e-Barat. For Bengali Hindus, the major religious festivals include Durga Puja, Kali Puja, Janmashtami and Rath Yatra. In honour of Bengali Buddhists and Bengali Christians, both Buddha's Birthday and Christmas are public holidays in the region. The Bengali New Year is the main secular festival of Bengali culture, celebrated by people regardless of religious and social backgrounds. The biggest congregation in Bengal is the Bishwa Ijtema, which is also the world's second-largest Islamic congregation. Other Bengali festivals include the first day of spring and the Nabanna harvest festival in autumn. Media. Bangladesh has a diverse, outspoken and privately owned press. English-language titles are popular among urban readers. West Bengal had 559 published newspapers in 2005, of which 430 were in Bengali, including the largest-circulation Bengali-language newspapers and magazines in the world. Bengali cinema is divided between the media hubs of Dhaka and Kolkata. Sports. Cricket and football are popular sports in the Bengal region. Local games include sports such as kho kho and kabaddi, the latter being the national sport of Bangladesh. An Indo-Bangladesh "Bengali Games" has been organised among the athletes of the Bengali-speaking areas of the two countries.
4864
18010125
https://en.wikipedia.org/wiki?curid=4864
Bucket argument
Isaac Newton's rotating bucket argument (also known as Newton's bucket) is a thought experiment designed to demonstrate that true rotational motion cannot be defined as the relative rotation of a body with respect to the immediately surrounding bodies. It is one of five arguments from the "properties, causes, and effects" of "true motion and rest" that support his contention that, in general, true motion and rest cannot be defined as special instances of motion or rest relative to other bodies, but instead can be defined only by reference to absolute space. Alternatively, these experiments provide an operational definition of what is meant by "absolute rotation", and do not pretend to address the question of "rotation relative to "what"?" General relativity dispenses with absolute space and with physics whose cause is external to the system, replacing them with the concept of geodesics of spacetime. Background. These arguments, and a discussion of the distinctions between absolute and relative time, space, place and motion, appear in a scholium at the end of the Definitions section in Book I of Newton's work, "The Mathematical Principles of Natural Philosophy" (1687) (not to be confused with the General Scholium at the end of Book III), which established the foundations of classical mechanics and introduced his law of universal gravitation, which yielded the first quantitatively adequate dynamical explanation of planetary motion. Despite their embrace of the principle of rectilinear inertia and the recognition of the kinematical relativity of apparent motion (which underlies the question of whether the Ptolemaic or the Copernican system is correct), natural philosophers of the seventeenth century continued to consider true motion and rest as physically separate descriptors of an individual body. The dominant view Newton opposed was devised by René Descartes, and was supported (in part) by Gottfried Leibniz. It held that empty space is a metaphysical impossibility because space is nothing other than the extension of matter; in other words, when one speaks of the space between things one is actually making reference to the relationship that exists between those things and not to some entity that stands between them. Concordant with this understanding, any assertion about the motion of a body boils down to a description over time in which the body under consideration is at "t"1 found in the vicinity of one group of "landmark" bodies and at some "t"2 is found in the vicinity of some other "landmark" body or bodies. Descartes recognized that there would be a real difference, however, between a situation in which a body with movable parts and originally at rest with respect to a surrounding ring was itself accelerated to a certain angular velocity with respect to the ring, and another in which the surrounding ring was given a contrary acceleration with respect to the central object. With sole regard to the central object and the surrounding ring, the motions would be indistinguishable from each other, assuming that both the central object and the surrounding ring were absolutely rigid objects. However, if neither the central object nor the surrounding ring were absolutely rigid, then the parts of one or both of them would tend to fly out from the axis of rotation. For contingent reasons having to do with the Inquisition, Descartes spoke of motion as both absolute and relative. 
By the late 19th century, the contention that "all motion is relative" was re-introduced, notably by Ernst Mach (German 1883, English translation 1893). The argument. Newton discusses a bucket filled with water hung by a cord. If the cord is twisted up tightly on itself and then the bucket is released, it begins to spin rapidly, not only with respect to the experimenter, but also in relation to the water it contains. Although the relative motion at this stage is the greatest, the surface of the water remains flat, indicating that the parts of the water have no tendency to recede from the axis of relative motion, despite proximity to the pail. Eventually, as the cord continues to unwind, the surface of the water assumes a concave shape as it acquires the motion of the bucket spinning relative to the experimenter. This concave shape shows that the water is rotating, despite the fact that the water is at rest relative to the pail. In other words, it is not the relative motion of the pail and water that causes the concavity of the water, contrary to the idea that motions can only be relative and that there is no absolute motion. Possibly the concavity of the water shows rotation relative to "something else": say absolute space? Newton says: "One can find out and measure the true and absolute circular motion of the water" (as rendered in the 1846 Andrew Motte translation). The argument that the motion is absolute, not relative, is incomplete, as it limits the participants relevant to the experiment to only the pail and the water, a limitation that has not been established. In fact, the concavity of the water clearly involves gravitational attraction, and by implication the Earth also is a participant. Mach criticised the argument, holding that only relative motion is established; the degree to which Mach's hypothesis is integrated in general relativity is discussed in the article Mach's principle, and it is generally held that general relativity is not entirely Machian. All observers agree that the surface of rotating water is curved. However, the explanation of this curvature involves centrifugal force for all observers with the exception of a truly stationary observer, who finds the curvature is consistent with the rate of rotation of the water as they observe it, with no need for an additional centrifugal force. Thus, a stationary frame can be identified, and it is not necessary to ask "Stationary with respect to what?". A supplementary thought experiment with the same objective of determining the occurrence of absolute rotation was also proposed by Newton: the example of observing two identical spheres in rotation about their center of gravity and tied together by a string. Occurrence of tension in the string is indicative of absolute rotation; see Rotating spheres. Detailed analysis. The historic interest of the rotating bucket experiment is its usefulness in suggesting one can detect absolute rotation by observation of the shape of the surface of the water. However, one might question just how rotation brings about this change. Below are two approaches to understanding the concavity of the surface of rotating water in a bucket. Newton's laws of motion. The shape of the surface of a rotating liquid in a bucket can be determined using Newton's laws for the various forces on an element of the surface; see, for example, Knudsen and Hjorth. 
The analysis begins with the free body diagram in the co-rotating frame where the water appears stationary. The height of the water "h" = "h"("r") is a function of the radial distance "r" from the axis of rotation Ω, and the aim is to determine this function. An element of water volume on the surface is subject to three forces: the vertical force due to gravity $F_g$, the horizontal, radially outward centrifugal force $F_{\text{Cfgl}}$, and the force normal to the surface of the water $F_n$ due to the rest of the water surrounding the selected element of surface. The force due to the surrounding water is known to be normal to the surface of the water because a liquid in equilibrium cannot support shear stresses (a point made by Anthony and Brackett). Moreover, because the element of water does not move, the sum of all three forces must be zero. To sum to zero, the force of the water must point oppositely to the sum of the centrifugal and gravity forces, which means the surface of the water must adjust so its normal points in this direction. (A very similar problem is the design of a banked turn, where the slope of the turn is set so a car will not slide off the road. The analogy in the case of the rotating bucket is that the element of water surface will "slide" up or down the surface unless the normal to the surface aligns with the vector resultant formed by the vector addition $F_g + F_{\text{Cfgl}}$.) As "r" increases, the centrifugal force increases according to the relation (the equations are written per unit mass): $F_{\text{Cfgl}} = \Omega^2 r$, where "Ω" is the constant rate of rotation of the water. The gravitational force is unchanged at $F_g = g$, where "g" is the acceleration due to gravity. These two forces add to make a resultant at an angle "φ" from the vertical given by $\tan\varphi = \frac{F_{\text{Cfgl}}}{F_g} = \frac{\Omega^2 r}{g}$, which clearly becomes larger as "r" increases. To ensure that this resultant is normal to the surface of the water, and therefore can be effectively nulled by the force of the water beneath, the normal to the surface must have the same angle, that is, $\tan\varphi = \frac{dh}{dr}$, leading to the ordinary differential equation for the shape of the surface: $\frac{dh}{dr} = \frac{\Omega^2 r}{g}$, or, integrating: $h(r) = h(0) + \frac{\Omega^2}{2g} r^2$, where "h"(0) is the height of the water at "r" = 0. In other words, the surface of the water is parabolic in its dependence upon the radius. (A numerical check of this result is sketched at the end of this section.) Potential energy. The shape of the water's surface can be found in a different, very intuitive way using the interesting idea of the potential energy associated with the centrifugal force in the co-rotating frame. In a reference frame uniformly rotating at angular rate Ω, the fictitious centrifugal force is conservative and has a potential energy of the form $U = -\tfrac{1}{2}\Omega^2 r^2$, where "r" is the radius from the axis of rotation. This result can be verified by taking the gradient of the potential to obtain the radially outward force: $F_r = -\frac{\partial U}{\partial r} = \Omega^2 r$. The meaning of the potential energy (stored work) is that movement of a test body from a larger radius to a smaller radius involves doing work against the centrifugal force and thus gaining potential energy. But this test body at the smaller radius where its elevation is lower has now lost equivalent gravitational potential energy. Potential energy therefore explains the concavity of the water surface in a rotating bucket. Notice that at equilibrium the surface adopts a shape such that an element of volume at any location on its surface has the same potential energy as at any other. That being so, no element of water on the surface has any incentive to move position, because all positions are equivalent in energy. 
That is, equilibrium is attained. On the other hand, were surface regions with lower energy available, the water occupying surface locations of higher potential energy would move to occupy these positions of lower energy, inasmuch as there is no barrier to lateral movement in an ideal liquid. We might imagine deliberately upsetting this equilibrium situation by somehow momentarily altering the surface shape of the water to make it different from an equal-energy surface. This change in shape would not be stable, and the water would not stay in our artificially contrived shape; rather, it would engage in a transient exploration of many shapes until non-ideal frictional forces introduced by sloshing, either against the sides of the bucket or by the non-ideal nature of the liquid, killed the oscillations and the water settled down to the equilibrium shape. To see the principle of an equal-energy surface at work, imagine gradually increasing the rate of rotation of the bucket from zero. The water surface is flat at first, and is clearly a surface of equal potential energy, because all points on the surface are at the same height in the gravitational field acting upon the water. At some small angular rate of rotation, however, an element of surface water can achieve lower potential energy by moving outward under the influence of the centrifugal force; think of an object falling under gravity toward the Earth's center: the object lowers its potential energy by moving in the direction of the force. Because water is incompressible and must remain within the confines of the bucket, this outward movement increases the depth of water at the larger radius, increasing the height of the surface at larger radius and lowering it at smaller radius. The surface of the water becomes slightly concave, with the consequence that the potential energy of the water at the greater radius is increased by the work done against gravity to achieve the greater height. As the height of water increases, movement toward the periphery becomes no longer advantageous, because the reduction in potential energy from working with the centrifugal force is balanced against the increase in energy working against gravity. Thus, at a given angular rate of rotation, a concave surface represents the stable situation, and the more rapid the rotation, the more concave this surface. If rotation is arrested, the energy stored in fashioning the concave surface must be dissipated, for example through friction, before an equilibrium flat surface is restored. To implement a surface of constant potential energy quantitatively, let the height of the water be $h(r)$: then the potential energy per unit mass contributed by gravity is $g\,h(r)$, and the total potential energy per unit mass on the surface is $U = g\,h(r) - \tfrac{1}{2}\Omega^2 r^2 + U_0$, with $U_0$ the background energy level independent of "r". In a static situation (no motion of the fluid in the rotating frame), this energy is constant independent of position "r". Requiring the energy to be constant, we obtain the parabolic form: $h(r) = \frac{\Omega^2}{2g} r^2 + h(0)$, where "h"(0) is the height at "r" = 0 (the axis). The principle of operation of the centrifuge also can be simply understood in terms of this expression for the potential energy, which shows that it is favorable energetically when the volume far from the axis of rotation is occupied by the heavier substance.
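The equal-energy argument can be restated compactly. The following LaTeX fragment is a sketch of that one-step derivation, using the same symbols as above; the constant $U_0$ is the arbitrary background level already mentioned.

```latex
% Constant total potential energy per unit mass on the surface
% (gravity term minus centrifugal term, plus an arbitrary constant U_0):
\[
U(r) \;=\; g\,h(r) \;-\; \tfrac{1}{2}\,\Omega^{2}r^{2} \;+\; U_{0}
\;=\; \text{const.}
\]
% Evaluating the constant at the axis (r = 0) and solving for h(r):
\[
g\,h(r) - \tfrac{1}{2}\,\Omega^{2}r^{2} \;=\; g\,h(0)
\qquad\Longrightarrow\qquad
h(r) \;=\; h(0) \;+\; \frac{\Omega^{2}}{2g}\,r^{2},
\]
% which is the same parabola obtained from the force balance above.
```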
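To make the force-balance result concrete, here is a minimal numerical sketch in Python. It is illustrative only: the rotation rate, bucket radius, and axis height are invented values, and the check simply confirms that integrating dh/dr = Ω²r/g reproduces the closed-form parabola derived above.

```python
# Numerical check of the rotating-bucket surface shape:
# dh/dr = (Omega**2 * r) / g, whose solution is the parabola
# h(r) = h(0) + Omega**2 * r**2 / (2 * g).
import numpy as np

OMEGA = 5.0   # rotation rate in rad/s (illustrative value)
G = 9.81      # gravitational acceleration in m/s^2
H0 = 0.10     # water height at the axis, h(0), in metres

r = np.linspace(0.0, 0.15, 301)   # radii out to a hypothetical 15 cm wall
slope = OMEGA**2 * r / G          # dh/dr from the force balance

# Integrate the slope numerically (trapezoidal rule)...
h_numeric = H0 + np.concatenate(([0.0], np.cumsum(
    0.5 * (slope[1:] + slope[:-1]) * np.diff(r))))

# ...and compare with the closed-form parabola.
h_exact = H0 + OMEGA**2 * r**2 / (2 * G)

assert np.allclose(h_numeric, h_exact, atol=1e-9)
print(f"height at the rim: {h_exact[-1]:.4f} m; at the axis: {H0:.4f} m")
```

As expected, the faster the rotation (larger OMEGA), the steeper the computed parabola, matching the observation that more rapid spinning makes the surface more concave.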
4865
1282315845
https://en.wikipedia.org/wiki?curid=4865
Roman Breviary
The Roman Breviary (Latin: "Breviarium Romanum") is a breviary of the Roman Rite in the Catholic Church. A liturgical book, it contains public or canonical prayers, hymns, the Psalms, readings, and notations for everyday use, especially by bishops, priests, and deacons in the Divine Office (i.e., at the canonical hours, the Christians' daily prayer). The volume containing the daily hours of Catholic prayer was published as the "Breviarium Romanum" (Roman Breviary) from its "editio princeps" in 1568 under Pope Pius V until the reforms of Paul VI (1974), when it was replaced by the Liturgy of the Hours. In the course of the Catholic Counter-Reformation, Pope Pius V (r. 1566–1572) imposed the use of the Roman Breviary, mainly based on the "Breviarium secundum usum Romanae Curiae", on the Latin Church of the Catholic Church. Exceptions are the Benedictines and Dominicans, who have breviaries of their own, and two surviving local-use breviaries, the Mozarabic Breviary and the Ambrosian Breviary. Origin of name. The Latin word "breviarium" generally signifies "abridgement, compendium". This wider sense has often been used by Christian authors, e.g. "Breviarium fidei, Breviarium in psalmos, Breviarium canonum, Breviarium regularum". In liturgical language specifically, "breviary" ("breviarium") has a special meaning, indicating a book furnishing the regulations for the celebration of Mass or the canonical Office, and may be met with under the titles "Breviarium Ecclesiastici Ordinis" or "Breviarium Ecclesiæ Romanæ". In the 9th century, Alcuin used the word to designate an office abridged or simplified for the use of the laity. Prudentius of Troyes, about the same period, composed a "Breviarium Psalterii". In an ancient inventory occurs "Breviarium Antiphonarii", meaning "Extracts from the Antiphonary". In the "Vita Aldrici" occurs "sicut in plenariis et breviariis Ecclesiæ ejusdem continentur". Again, in the inventories in the catalogues, such notes as these may be met with: "Sunt et duo cursinarii et tres benedictionales Libri; ex his unus habet obsequium mortuorum et unus Breviarius", or, "Præter Breviarium quoddam quod usque ad festivitatem S. Joannis Baptistæ retinebunt", etc. Around 1100, Monte Cassino obtained a book titled "Incipit Breviarium sive Ordo Officiorum per totam anni decursionem". From such references, and from others of a like nature, Quesnel gathers that by the word "Breviarium" was at first designated a book furnishing the rubrics, a sort of Ordo. The usage of "breviary" to mean a book containing the entire canonical office appears to date from the 11th century. Pope Gregory VII (r. 1073–1085) having abridged the order of prayers, and having simplified the Liturgy as performed at the Roman Court, this abridgment received the name of Breviary, which was suitable since, according to the etymology of the word, it was an abridgment. The name has been extended to books which contain in one volume, or at least in one work, liturgical books of different kinds, such as the Psalter, the Antiphonary, the Responsoriary, the Lectionary, etc. In this connection it may be pointed out that, in this sense, the word as used nowadays is illogical; such a book should be named a Plenarium rather than a Breviarium, since, liturgically speaking, the word Plenarium exactly designates books containing several different compilations united under one cover. History. Early history. The canonical hours of the Breviary owe their remote origin to the Old Covenant, when God commanded the Aaronic priests to offer morning and evening sacrifices. 
Other inspiration may have come from David's words in the Psalms, "Seven times a day I praise you" (Ps. 119:164) and "the just man meditates on the law day and night" (Ps. 1:2), and from what is said of Daniel: "Three times daily he was kneeling and offering prayers and thanks to his God" (Dan. 6:10). In the early days of Christian worship the Sacred Scriptures furnished all that was thought necessary, containing as they did the books from which the lessons were read and the psalms that were recited. The first step in the evolution of the Breviary was the separation of the Psalter into a choir-book. At first the president of the local church (bishop) or the leader of the choir chose a particular psalm as he thought appropriate. From about the 4th century certain psalms began to be grouped together, a process that was furthered by the monastic practice of daily reciting the 150 psalms. This took so much time that the monks began to spread it over a week, dividing each day into hours, and allotting to each hour its portion of the Psalter. St Benedict in the 6th century drew up such an arrangement, probably, though not certainly, on the basis of an older Roman division which, though not so skilful, is the one in general use. Gradually these psalter choir-books acquired additions in the form of antiphons, responses, collects or short prayers, for the use of those not skilful at improvisation and metrical compositions. Jean Beleth, a 12th-century liturgical author, gives the following list of books necessary for the right conduct of the canonical office: the Antiphonarium, the Old and New Testaments, the "Passionarius (liber)" and the "Legendarius" (dealing respectively with martyrs and saints), the "Homiliarius" (homilies on the Gospels), the "Sermologus" (collection of sermons) and the works of the Fathers, besides the "Psalterium" and the "Collectarium". To overcome the inconvenience of using such a library, the Breviary came into existence and use. Already in the 9th century Prudentius, bishop of Troyes, had in a "Breviarium Psalterii" made an abridgment of the Psalter for the laity, giving a few psalms for each day, and Alcuin had rendered a similar service by including a prayer for each day and some other prayers, but no lessons or homilies. Medieval breviaries. The Breviary proper only dates from the 11th century; the earliest manuscript containing the whole canonical office is of the year 1099, and is in the Mazarin library. Gregory VII (pope 1073–1085), too, simplified the liturgy as performed at the Roman court, and gave his abridgment the name of Breviary, which thus came to denote a work which from another point of view might be called a Plenary, involving as it did the collection of several works into one. There are several extant specimens of 12th-century Breviaries, all Benedictine, but under Innocent III (pope 1198–1216) their use was extended, especially by the newly founded and active Franciscan order. These preaching friars, with the authorization of Gregory IX, adopted (with some modifications, e.g. the substitution of the "Gallican" for the "Roman" version of the Psalter) the Breviary hitherto used exclusively by the Roman court, and with it gradually swept out of Europe all the earlier partial books (Legendaries, Responsories), etc., and to some extent the local Breviaries, like that of Sarum. Finally, Nicholas III (pope 1277–1280) adopted this version both for the curia and for the basilicas of Rome, and thus made its position secure. 
Before the rise of the mendicant orders (wandering friars) in the 13th century, the daily services were usually contained in a number of large volumes. The first single-volume manuscript of the daily office was written by the Benedictine order at Monte Cassino in Italy in 1099. The Benedictines were not a mendicant order but a stable, monastery-based order, and single-volume breviaries are rare from this early period. The arrangement of the Psalms in the Rule of St. Benedict had a profound impact upon the breviaries used by secular and monastic clergy alike, until 1911, when Pope Pius X introduced his reform of the Roman Breviary. In many places, every diocese, order or ecclesiastical province maintained its own edition of the breviary. However, mendicant friars travelled frequently and needed a shortened, or abbreviated, daily office contained in one portable book, and single-volume breviaries flourished from the thirteenth century onwards. These abbreviated volumes soon became very popular and eventually supplanted the Catholic Church's Curia office, previously said by non-monastic clergy. Early printed editions. Before the advent of printing, breviaries were written by hand and were often richly decorated with initials and miniature illustrations telling stories in the lives of Christ or the saints, or stories from the Bible. Later printed breviaries usually have woodcut illustrations, interesting in their own right but bearing little relation to the beautifully illuminated breviaries that preceded them. The beauty and value of many of the Latin Breviaries were brought to the notice of English churchmen by one of the numbers of the Oxford "Tracts for the Times", since which time they have been much more studied, both for their own sake and for the light they throw upon the English Prayer-Book. Early printed Breviaries were locally distributed and quickly worn out by daily use. As a result, surviving copies are rare; of those editions which survive at all, many are known only by a single copy. In Scotland the only one which has survived the convulsions of the 16th century is the "Aberdeen Breviary", a Scottish form of the Sarum Office (the Sarum Rite was much favoured in Scotland as a kind of protest against the jurisdiction claimed by the diocese of York), revised by William Elphinstone (bishop 1483–1514), and printed at Edinburgh by Walter Chapman and Androw Myllar in 1509–1510. Four copies of it have been preserved, of which only one is complete; but it was reprinted in facsimile in 1854 for the Bannatyne Club through the munificence of the Duke of Buccleuch. It is particularly valuable for the trustworthy notices of the early history of Scotland which are embedded in the lives of the national saints. Though enjoined by royal mandate in 1501 for general use within the realm of Scotland, it was probably never widely adopted. The new Scottish "Proprium" sanctioned for the Catholic province of St Andrews in 1903 contains many of the old Aberdeen collects and antiphons. The Sarum or Salisbury Breviary itself was very widely used. The first edition was printed at Venice in 1483 by Raynald de Novimagio in folio; the latest at Paris, in 1556 and 1557. While modern Breviaries are nearly always printed in four volumes, one for each season of the year, the editions of the Sarum never exceeded two parts. Early modern reforms. Until the Council of Trent (1545–1563) and the Catholic Counter-Reformation, every bishop had full power to regulate the Breviary of his own diocese; and this was acted upon almost everywhere. 
Each monastic community, also, had one of its own. Pope Pius V (r. 1566–1572), however, while sanctioning those which could show at least 200 years of existence, made the Roman obligatory in all other places. But the influence of the Roman rite has gradually gone much beyond this, and has superseded almost all the local uses. The Roman has thus become nearly universal, with the allowance only of additional offices for saints specially venerated in each particular diocese. The Roman Breviary has undergone several revisions; the most remarkable of these is that by Francis Quignonez, cardinal of Santa Croce in Gerusalemme (1536), which, though not accepted by Rome (it was approved by Clement VII and Paul III, and permitted as a substitute for the unrevised Breviary, until Pius V in 1568 excluded it as too short and too modern, and issued a reformed edition of the old Breviary, the "Breviarium Pianum" or "Pian Breviary"), formed the model for the still more thorough reform made in 1549 by the Church of England, whose daily morning and evening services are but a condensation and simplification of the Breviary offices. Some parts of the prefaces at the beginning of the English Prayer-Book are free translations of those of Quignonez. The Pian Breviary was again altered by Sixtus V in 1588, who introduced the revised Vulgate, in 1602 by Clement VIII (through Baronius and Bellarmine), especially as concerns the rubrics, and by Urban VIII (1623–1644), a purist who altered the text of certain hymns. In the 17th and 18th centuries a movement of revision took place in France, and succeeded in modifying about half the Breviaries of that country. Historically, this proceeded from the labours of Jean de Launoy (1603–1678), "le dénicheur des saints", and Louis Sébastien le Nain de Tillemont, who had shown the falsity of numerous lives of the saints; theologically it was produced by the Port Royal school, which led men to dwell more on communion with God as contrasted with the invocation of the saints. This was mainly carried out by the adoption of a rule that all antiphons and responses should be in the exact words of Scripture, which cut out the whole class of appeals to created beings. The services were at the same time simplified and shortened, and the use of the whole Psalter every week (which had become a mere theory in the Roman Breviary, owing to its frequent supersession by saints' day services) was made a reality. These reformed French Breviaries (e.g. the Paris Breviary of 1680 by Archbishop François de Harlay (1625–1695) and that of 1736 by Archbishop Charles-Gaspard-Guillaume de Vintimille du Luc (1655–1746)) show a deep knowledge of Holy Scripture, and much careful adaptation of different texts. Later modern reforms. During the pontificate of Pius IX, a strong Ultramontane movement arose against the French breviaries of 1680 and 1736. This was inaugurated by Montalembert, but its literary advocates were chiefly Prosper Guéranger, abbot of the Benedictine monastery of Solesmes, and Louis Veuillot (1813–1883) of the "Univers". The movement succeeded in suppressing the breviaries, the last diocese to surrender being Orleans in 1875. The Jansenist and Gallican influence was also strongly felt in Italy and in Germany, where breviaries based on the French models were published at Cologne, Münster, Mainz and other towns. Meanwhile, under the direction of Benedict XIV (pope 1740–1758), a special congregation collected much material for an official revision, but nothing was published. 
In 1902, under Leo XIII, a commission under the presidency of Louis Duchesne was appointed to consider the breviary, the missal, the Roman Pontifical and the Roman Ritual. Significant changes came in 1911 with the reform of the Roman Breviary by Pope Pius X. This revision modified the traditional psalm scheme so that, while all 150 psalms were used in the course of the week, these were said without repetition. Those assigned to the Sunday office underwent the least revision, although noticeably fewer psalms are recited at Matins, and both Lauds and Compline are slightly shorter due to psalms (or, in the case of Compline, the first few verses of a psalm) being removed. Pius X was probably influenced by earlier attempts to eliminate repetition in the psalter, most notably the liturgy of the Benedictine congregation of St. Maur. However, since Cardinal Quignonez's attempt to reform the Breviary had employed this principle (albeit with no regard to the traditional scheme), such notions had floated around in the western Church, and can particularly be seen in the Paris Breviary. Pope Pius XII introduced the optional use of a new translation of the Psalms from the Hebrew into a more classical Latin. Most breviaries published in the late 1950s and early 1960s used this "Pian Psalter". Pope John XXIII also revised the Breviary in 1960, introducing changes drawn up by his predecessor Pope Pius XII. The most notable alteration is the shortening of most feasts from nine to three lessons at Matins, keeping only the Scripture readings (the former lesson i, then lessons ii and iii together), followed by either the first part of the patristic reading (lesson vii) or, for most feasts, a condensed version of the former second Nocturn, which was formerly used when a feast was reduced in rank and commemorated. Abrogation and subsequent reauthorization. The Second Vatican Council, in its constitution "Sacrosanctum Concilium", asked the Pope for a comprehensive reform of the Hours. As a result, in 1970 the Breviary was replaced by the "Liturgy of the Hours", which is divided into six different volumes: Advent, Christmas, Lent and Easter, and two for Ordinary Time; the new Hours were promulgated by Pope Paul VI in his apostolic constitution "Laudis canticum". In his apostolic letter "Summorum Pontificum", Pope Benedict XVI allowed clerics to fulfill their obligation of prayer using the 1962 edition of the Roman Breviary in lieu of the Liturgy of the Hours. Contents of the Roman Breviary. At the beginning stands the usual introductory matter, such as the tables for determining the date of Easter, the calendar, and the general rubrics. The Breviary itself is divided into four seasonal parts (winter, spring, summer, autumn) and comprises under each part the Psalter; the "Proprium de Tempore"; the "Proprium Sanctorum"; the "Commune Sanctorum"; and certain extra services. These parts are often published separately. The Psalter. This psalm book is the very backbone of the Breviary, the groundwork of the Catholic prayer-book; out of it have grown the antiphons, responsories and versicles. Until the 1911 reform, the psalms were arranged according to a disposition dating from the 8th century, as follows: Psalms 1–108, with some omissions, were recited at Matins, twelve each day from Monday to Saturday, and eighteen on Sunday. The omissions were said at Lauds, Prime and Compline. Psalms 109–147 (except 117, 118, and 142) were said at Vespers, five each day. Psalms 148–150 were always used at Lauds, and give that hour its name. The text of this Psalter is that commonly known as the Gallican. 
The name is misleading, for it is simply the second revision (A.D. 392) made by Jerome of the old "Itala" version originally used in Rome. Jerome's first revision of the "Itala" (A.D. 383), known as the Roman, is still used at St Peter's in Rome, but the "Gallican", thanks especially to St Gregory of Tours, who introduced it into Gaul in the 6th century, has ousted it everywhere else. The Antiphonary of Bangor proves that Ireland accepted the Gallican version in the 7th century, and the English Church did so in the 10th. Following the 1911 reform, Matins was reduced to nine psalms every day, with the other psalms redistributed throughout Prime, Terce, Sext, and Compline. For Sundays and special feasts Lauds and Vespers largely remained the same; Psalm 118 remained distributed at the Little Hours, and Psalms 4, 90, and 130 were kept at Compline. The "Proprium de Tempore". This contains the office of the seasons of the Christian year (Advent to Trinity), a conception that only gradually grew up. There is here given the whole service for every Sunday and weekday, the proper antiphons, responsories, hymns, and especially the course of daily Scripture reading, averaging about twenty verses a day. The "Proprium Sanctorum". This contains the lessons, psalms and liturgical formularies for saints' festivals, and depends on the days of the secular month. The readings of the second Nocturn are mainly hagiological biography, with homilies or papal documents for certain major feasts, particularly those of Jesus and Mary. Some of this material was revised by Leo XIII in view of archaeological and other discoveries. The third Nocturn consists of a homily on the Gospel which is read at that day's Mass. Covering a great stretch of time and space, these readings do for the worshipper in the field of church history what the Scripture readings do in that of biblical history. The "Commune Sanctorum". This comprises psalms, antiphons, lessons, &c., for feasts of various groups or classes (twelve in all); e.g. apostles, martyrs, confessors, virgins, and the Blessed Virgin Mary. These offices are of very ancient date, and many of them were probably in origin proper to individual saints. They contain passages of great literary beauty. The lessons read at the third nocturn are patristic homilies on the Gospels, and together form a rough summary of theological instruction. Extra services. Here are found the Little Office of the Blessed Virgin Mary, the Office for the Dead (obligatory on All Souls' Day), and offices peculiar to each diocese. Elements of the Hours. It has already been indicated, by reference to Matins, Lauds, &c., that not only each day, but each part of the day, has its own office, the day being divided into liturgical "hours". A detailed account of these will be found in the article Canonical Hours. Each of the hours of the office is composed of the same elements, and something must be said now of the nature of these constituent parts, of which mention has here and there already been made. They are: psalms (including canticles), antiphons, responsories, hymns, lessons, little chapters, versicles and collects. Psalms. Before the 1911 reform, the multiplication of saints' festivals, with practically the same festal psalms, tended towards the repetition of about one-third of the Psalter, with a correspondingly rare recital of the remaining two-thirds. 
Following this reform, the entire Psalter is again generally recited each week, with the festal psalms restricted to only the highest-ranking feasts. As in the Greek usage and in the Benedictine, certain canticles, like the Song of Moses (Exodus xv.), the Song of Hannah (1 Sam. ii.), the prayer of Habakkuk (iii.), the prayer of Hezekiah (Isaiah xxxviii.) and other similar Old Testament passages, and, from the New Testament, the Magnificat, the Benedictus and the Nunc dimittis, are admitted as psalms. Antiphons. The antiphons are short liturgical forms, sometimes of biblical, sometimes of patristic origin, used to introduce a psalm. The term originally signified a chant by alternate choirs, but has quite lost this meaning in the Breviary. Responsories. The responsories are similar in form to the antiphons, but come at the end of the psalm, being originally the reply of the choir or congregation to the precentor who recited the psalm. Hymns. The hymns are short poems going back in part to the days of Prudentius, Synesius, Gregory of Nazianzus and Ambrose (4th and 5th centuries), but mainly the work of medieval authors. Lessons. The lessons, as has been seen, are drawn variously from the Bible, the Acts of the Saints and the Fathers of the Church. In the primitive church, books afterwards excluded from the canon were often read, e.g. the letters of Clement of Rome and the Shepherd of Hermas. In later days the churches of Africa, having rich memorials of martyrdom, used them to supplement the reading of Scripture. Monastic influence accounts for the practice of adding to the reading of a biblical passage some patristic commentary or exposition. Books of homilies were compiled from the writings of SS. Augustine, Hilary, Athanasius, Isidore, Gregory the Great and others, and formed part of the library of which the Breviary was the ultimate compendium. In the lessons, as in the psalms, the order for special days breaks in upon the normal order of ferial offices and dislocates the scheme for consecutive reading. The lessons are read at Matins (which is subdivided into three nocturns). Little chapters. The little chapters are very short lessons read at the other "hours". Versicles. The versicles are short responsories used after the little chapters in the minor hours. They appear after the hymns in Lauds and Vespers. Collects. The collects come at the close of the office and are short prayers summing up the supplications of the congregation. They arise out of a primitive practice on the part of the bishop (local president), examples of which are found in the Didachē (Teaching of the Apostles) and in the letters of Clement of Rome and Cyprian. With the crystallization of church order, improvisation in prayer largely gave place to set forms, and collections of prayers were made which later developed into Sacramentaries and Orationals. The collects of the Breviary are largely drawn from the Gelasian and other Sacramentaries, and serve to sum up the dominant idea of the festival on which they are used. Celebration. Before the 1911 reform, the difficulty of harmonizing the "Proprium de Tempore" and the "Proprium Sanctorum", to which reference has been made, was only partly met in the thirty-seven chapters of general rubrics. Additional help was given by a kind of Catholic Churchman's Almanack, called the "Ordo Recitandi Divini Officii", published in different countries and dioceses, and giving, under every day, minute directions for proper reading. 
In 1960, John XXIII simplified the rubrics governing the Breviary in order to make it easier to use. Every cleric in Holy Orders, and many other members of religious orders, must publicly join in or privately read aloud (i.e. using the lips as well as the eyes; it takes about two hours in this way) the whole of the Breviary services allotted for each day. In large churches where they were celebrated, the services were usually grouped: e.g. Matins and Lauds (about 7.30 A.M.); Prime, Terce (High Mass), Sext, and None (about 10 A.M.); Vespers and Compline (4 P.M.); and from four to eight hours (depending on the amount of music and the number of high masses) were thus spent in choir. Lay use of the Breviary has varied throughout the Church's history. In some periods laymen did not use the Breviary as a manual of devotion to any great extent. The late Medieval period saw the recitation of certain hours of the Little Office of the Blessed Virgin, which was based on the Breviary in form and content, becoming popular among those who could read, and Bishop Challoner did much to popularise the hours of Sunday Vespers and Compline (albeit in English translation) in his "Garden of the Soul" in the eighteenth century. The Liturgical Movement in the twentieth century saw renewed interest in the Offices of the Breviary, and several popular editions were produced, containing the vernacular as well as the Latin. The complete pre-Pius X Roman Breviary was translated into English (by the Marquess of Bute in 1879; new edition with a translation of the Martyrology, 1908), French and German. Bute's version is noteworthy for its inclusion of the skilful renderings of the ancient hymns by J.H. Newman, J.M. Neale and others. Several editions of the Pius X Breviary were produced during the twentieth century, including a notable edition prepared with the assistance of the sisters of Stanbrook Abbey in the 1950s. Two editions in English and Latin, which conformed to the rubrics of 1960, were produced in the following decade, published by Liturgical Press and Benziger in the United States; these used the Pius XII psalter. Baronius Press's revised edition of the Liturgical Press edition uses the older Gallican psalter of St. Jerome. This edition was published in 2012 for pre-orders only; in 2013 the publisher resumed printing, and it is available on Baronius's website. Under Pope Benedict XVI's motu proprio "Summorum Pontificum", Catholic bishops, priests, and deacons are again permitted to use the 1961 edition of the Roman Breviary, promulgated by Pope John XXIII, to satisfy their obligation to recite the Divine Office every day. Online resources. In 2008, a website containing the Divine Office (both Ordinary and Extraordinary) in various languages, "i-breviary", was launched, which combines the modern and ancient breviaries with the latest computer technology.
4868
27015025
https://en.wikipedia.org/wiki?curid=4868
B. F. Skinner
Burrhus Frederic Skinner (March 20, 1904 – August 18, 1990) was an American psychologist, behaviorist, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1948 until his retirement in 1974. Skinner developed behavior analysis, especially the philosophy of radical behaviorism, and founded the experimental analysis of behavior, a school of experimental research psychology. He also used operant conditioning to strengthen behavior, considering the rate of response to be the most effective measure of response strength. To study operant conditioning, he invented the operant conditioning chamber (also known as the Skinner box), and to measure rate he invented the cumulative recorder. Using these tools, he and Charles Ferster produced Skinner's most influential experimental work, outlined in their 1957 book "Schedules of Reinforcement". Skinner was a prolific author, publishing 21 books and 180 articles. He imagined the application of his ideas to the design of a human community in his 1948 utopian novel, "Walden Two", while his analysis of human behavior culminated in his 1957 work, "Verbal Behavior". Skinner, John B. Watson and Ivan Pavlov are considered the pioneers of modern behaviorism; a June 2002 survey listed Skinner as the most influential psychologist of the 20th century. Early life. Skinner was born in Susquehanna, Pennsylvania, to Grace and William Skinner, the latter of whom was a lawyer. Skinner became an atheist after a Christian teacher tried to assuage his fear of the hell that his grandmother described. His brother Edward, two and a half years younger, died at age 16 of a cerebral hemorrhage. Skinner's closest friend as a young boy was Raphael Miller, whom he called Doc because his father was a doctor. Doc and Skinner became friends due to their parents' religiousness, and both had an interest in contraptions and gadgets. They set up a telegraph line between their houses to send messages to each other, although they had to call each other on the telephone due to the confusing messages sent back and forth. During one summer, Doc and Skinner started an elderberry business, gathering berries and selling them door to door. They found that when they picked the ripe berries, the unripe ones came off the branches too, so they built a device that was able to separate them. The device was a piece of metal bent to form a trough. They would pour water down the trough into a bucket; the ripe berries would sink into the bucket, while the unripe ones would be pushed over the edge to be thrown away. Education. Skinner attended Hamilton College in Clinton, New York, with the intention of becoming a writer. He found himself at a social disadvantage at the college because of his intellectual attitude. He was a member of the Lambda Chi Alpha fraternity. He wrote for the school paper, but, as an atheist, he was critical of the traditional mores of his college. After receiving his Bachelor of Arts in English literature in 1926, he attended Harvard University, where he would later research and teach. While attending Harvard, a fellow student, Fred S. Keller, convinced Skinner that he could make an experimental science of the study of behavior. This led Skinner to invent a prototype for the Skinner box and to join Keller in the creation of other tools for small experiments. After graduation, Skinner unsuccessfully tried to write a novel while he lived with his parents, a period that he later called the "Dark Years". 
He became disillusioned with his literary skills despite encouragement from the poet Robert Frost, concluding that he had little world experience and no strong personal perspective from which to write. His encounter with John B. Watson's "behaviorism" led him into graduate study in psychology and to the development of his own version of behaviorism. Career. Skinner received a PhD from Harvard in 1931 and remained there as a researcher until 1936, when he went to the University of Minnesota in Minneapolis to teach. In 1945, he moved to Indiana University, where he was chair of the psychology department from 1946 to 1947, before returning to Harvard as a tenured professor in 1948. He remained at Harvard for the rest of his life. In 1973, Skinner was one of the signers of the "Humanist Manifesto II". Personal life. In 1936, Skinner married Yvonne "Eve" Blue. The couple had two daughters, Julie (later Vargas) and Deborah (later Buzan; married to Barry Buzan). Yvonne died in 1997 and is buried in Mount Auburn Cemetery, Cambridge, Massachusetts. Death. Skinner's public exposure had increased in the 1970s, and he remained active even after his retirement in 1974, until his death. In 1989, Skinner was diagnosed with leukemia; he died on August 18, 1990, in Cambridge, Massachusetts. Ten days before his death, he was given a lifetime achievement award by the American Psychological Association and gave a talk concerning his work. Contributions to psychology. Behaviorism. Skinner referred to his approach to the study of behavior as "radical behaviorism", which originated in the early 1900s as a reaction to depth psychology and other traditional forms of psychology that often had difficulty making predictions that could be tested experimentally. This philosophy of behavioral science assumes that behavior is a consequence of environmental histories of reinforcement (see applied behavior analysis). Foundations of Skinner's behaviorism. Skinner's ideas about behaviorism were largely set forth in his first book, "The Behavior of Organisms" (1938). Here, he gives a systematic description of the manner in which environmental variables control behavior. He distinguished two sorts of behavior, which are controlled in different ways: respondent behavior, which is elicited by a stimulus, and operant behavior, which is emitted and controlled by its consequences. Both sorts of behavior had already been studied experimentally, most notably: respondents, by Ivan Pavlov; and operants, by Edward Thorndike. Skinner's account differed in some ways from earlier ones, and was one of the first to bring them under one roof. The idea that behavior is strengthened or weakened by its consequences raises several questions. Among the most commonly asked are these: 1. Origin of operant behavior. Skinner's answer to the first question was very much like Darwin's answer to the question of the origin of a 'new' bodily structure, namely, variation and selection. The behavior of an individual varies from moment to moment; a variation that is followed by reinforcement is strengthened and becomes prominent in that individual's behavioral repertoire. "Shaping" was Skinner's term for the gradual modification of behavior by the reinforcement of desired variations. Skinner believed that 'superstitious' behavior can arise when a response happens to be followed by reinforcement to which it is actually unrelated. 2. Control of operant behavior. The second question, "how is operant behavior controlled?", arises because, to begin with, the behavior is "emitted" without reference to any particular stimulus.
Skinner answered this question by saying that a stimulus comes to control an operant if it is present when the response is reinforced and absent when it is not. For example, if lever-pressing brings food only when a light is on, a rat, or a child, will learn to press the lever only when the light is on. Skinner summarized this relationship by saying that a discriminative stimulus (e.g., a light or a sound) sets the occasion for the reinforcement (food) of the operant (lever-press). This three-term contingency (stimulus-response-reinforcer) is one of Skinner's most important concepts, and sets his theory apart from theories that use only pair-wise associations. 3. Explaining complex behavior. Most human behavior cannot easily be described in terms of individual responses reinforced one by one, and Skinner devoted a great deal of effort to the problem of behavioral complexity. Some complex behavior can be seen as a sequence of relatively simple responses, and here Skinner invoked the idea of "chaining". Chaining is based on the experimentally demonstrated fact that a discriminative stimulus not only sets the occasion for subsequent behavior, but can also reinforce a behavior that precedes it. That is, a discriminative stimulus is also a "conditioned reinforcer". For example, the light that sets the occasion for lever pressing may also be used to reinforce "turning around" in the presence of a noise. This results in the sequence "noise – turn around – light – press lever – food." Much longer chains can be built by adding more stimuli and responses. However, Skinner recognized that a great deal of behavior, especially human behavior, cannot be accounted for by gradual shaping or the construction of response sequences. Complex behavior often appears suddenly in its final form, as when a person first finds his way to the elevator by following instructions given at the front desk. To account for such behavior, Skinner introduced the concept of rule-governed behavior. First, relatively simple behaviors come under the control of verbal stimuli: the child learns to "jump," "open the book," and so on. After a large number of responses come under such verbal control, a sequence of verbal stimuli can evoke an almost unlimited variety of complex responses. Reinforcement. Reinforcement, a key concept of behaviorism, is the primary process that shapes and controls behavior, and occurs in two ways: "positive" and "negative". In "The Behavior of Organisms" (1938), Skinner defined "negative reinforcement" as synonymous with "punishment", i.e., the presentation of an aversive stimulus; he subsequently redefined the term in "Science and Human Behavior" (1953). In what has become the standard set of definitions, "positive" reinforcement is the strengthening of behavior by the occurrence of some event (e.g., praise after some behavior is performed), whereas "negative" reinforcement is the strengthening of behavior by the removal or avoidance of some aversive event (e.g., opening and raising an umbrella over your head on a rainy day is reinforced by the cessation of rain falling on you). Both types of reinforcement strengthen behavior, or increase the probability of a behavior recurring; the difference lies in whether the reinforcing event is something applied (positive reinforcement) or something removed or avoided (negative reinforcement).
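The interplay of discriminative stimulus, response, and reinforcer described above is simple enough to sketch in code. The following Python fragment is an illustrative toy, not a model drawn from Skinner's own work; the function name, probabilities, and learning rate are all invented for the example. It shows how reinforcing lever presses only while a light is on drives pressing up in the light's presence and down in its absence.

import random

# Toy model of a three-term contingency: a discriminative stimulus (light)
# sets the occasion for reinforcement (food) of an operant (lever press).
# All parameters are invented for illustration.
def simulate(trials=20000, learning_rate=0.01):
    # Probability of pressing the lever under each stimulus condition.
    p_press = {"light_on": 0.5, "light_off": 0.5}
    for _ in range(trials):
        stimulus = random.choice(["light_on", "light_off"])
        if random.random() < p_press[stimulus]:  # the animal presses
            if stimulus == "light_on":
                # Reinforced press: pressing under this stimulus strengthens.
                p_press[stimulus] += learning_rate * (1 - p_press[stimulus])
            else:
                # Unreinforced press: pressing under this stimulus extinguishes.
                p_press[stimulus] -= learning_rate * p_press[stimulus]
    return p_press

print(simulate())  # e.g. {'light_on': ~0.99, 'light_off': ~0.01}

Run as written, the press probability climbs toward 1 with the light on and falls toward 0 with it off; this illustrates the discriminative control described above, not Skinner's own quantitative methods.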
Punishment can be the "application" of an aversive stimulus/event (positive punishment, or punishment by contingent stimulation) or the "removal" of a desirable stimulus (negative punishment, or punishment by contingent withdrawal). Though punishment is often used to suppress behavior, Skinner argued that this suppression is temporary and has a number of other, often unwanted, consequences. Extinction is the withholding of reinforcement that previously followed a behavior, which weakens that behavior. Writing in 1981, Skinner pointed out that Darwinian natural selection is, like reinforced behavior, "selection by consequences". Though, as he said, natural selection has now "made its case," he regretted that essentially the same process, "reinforcement", was less widely accepted as underlying human behavior. Schedules of reinforcement. Skinner recognized that behavior is typically reinforced more than once, and, together with Charles Ferster, he carried out an extensive analysis of the various ways in which reinforcements could be arranged over time, which he called "schedules of reinforcement". The most notable schedules studied by Skinner were continuous, interval (fixed or variable), and ratio (fixed or variable). All are methods used in operant conditioning. Token economy. "Skinnerian" principles have been used to create token economies in a number of institutions, such as psychiatric hospitals. When participants behave in desirable ways, their behavior is reinforced with tokens that can be exchanged for such items as candy, cigarettes, coffee, or the exclusive use of a radio or television set. "Verbal Behavior". Challenged by Alfred North Whitehead during a casual discussion at Harvard to provide an account of a randomly provided piece of verbal behavior, Skinner set about extending his then-new functional, inductive approach to the complexity of human verbal behavior. Developed over two decades, his work appeared in the book "Verbal Behavior". Although Noam Chomsky was highly critical of "Verbal Behavior", he conceded that Skinner's "S-R psychology" was worth a review. Behavior analysts reject Chomsky's appraisal of Skinner's work as merely "stimulus-response psychology," and some have argued that this mischaracterization reflects a poor understanding of Skinner's work and of the field of behavior analysis as a whole. "Verbal Behavior" had an uncharacteristically cool reception, partly as a result of Chomsky's review and partly because of Skinner's failure to address or rebut any of Chomsky's criticisms. Skinner's peers may also have been slow to adopt its ideas because the book lacked experimental evidence, in contrast to the empirical density that marked Skinner's other experimental work. Scientific inventions. Operant conditioning chamber. An operant conditioning chamber (also known as a "Skinner box") is a laboratory apparatus used in the experimental analysis of animal behavior. It was invented by Skinner while he was a graduate student at Harvard University. As used by Skinner, the box had a lever (for rats) or a disk in one wall (for pigeons). A press on this "manipulandum" could deliver food to the animal through an opening in the wall, and responses reinforced in this way increased in frequency.
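Returning to the schedules of reinforcement described above, the logic of the simplest ratio schedules can also be made concrete in a few lines of code. The Python fragment below is a hypothetical illustration; the function names and the ratio value are invented, not taken from Ferster and Skinner's procedures. It contrasts a fixed-ratio schedule, which reinforces every nth response, with a variable-ratio schedule, which reinforces after a randomly varying number of responses averaging n.

import random

# Hedged sketch of two ratio schedules of reinforcement; parameters invented.
def fixed_ratio(n):
    """Deliver a reinforcer after every nth response."""
    count = 0
    def schedule():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True  # reinforcer delivered
        return False
    return schedule

def variable_ratio(mean_n):
    """Deliver a reinforcer after a random number of responses averaging mean_n."""
    count, target = 0, random.randint(1, 2 * mean_n - 1)
    def schedule():
        nonlocal count, target
        count += 1
        if count >= target:
            count, target = 0, random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return schedule

fr5, vr5 = fixed_ratio(5), variable_ratio(5)
print([fr5() for _ in range(15)])  # reinforced on exactly every 5th response
print([vr5() for _ in range(15)])  # reinforced at unpredictable points, ~every 5th

Variable-ratio schedules of this kind are generally reported to produce high, steady response rates, a result often invoked to explain the persistence of gambling.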
By controlling this reinforcement together with discriminative stimuli such as lights and tones, or punishments such as electric shocks, experimenters have used the operant box to study a wide variety of topics, including schedules of reinforcement, discriminative control, delayed response ("memory"), and punishment. By channeling research in these directions, the operant conditioning chamber has had a huge influence on the course of research in animal learning and its applications. It enabled great progress on problems that could be studied by measuring the rate, probability, or force of a simple, repeatable response. However, it discouraged the study of behavioral processes not easily conceptualized in such terms; spatial learning, in particular, is now studied in quite different ways, for example, by the use of the water maze. Cumulative recorder. The cumulative recorder makes a pen-and-ink record of simple repeated responses. Skinner designed it for use with the operant chamber as a convenient way to record and view the rate of responses such as a lever press or a key peck. In this device, a sheet of paper gradually unrolls over a cylinder. Each response steps a small pen across the paper, starting at one edge; when the pen reaches the other edge, it quickly resets to the initial side. The slope of the resulting ink line graphically displays the rate of the response: rapid responding yields a steeply sloping line, slow responding a shallow one. The cumulative recorder was a key tool in Skinner's analysis of behavior, and it was very widely adopted by other experimenters, gradually falling out of use with the advent of the laboratory computer and of line graphs. Skinner's major experimental exploration of response rates, presented in his book with Charles Ferster, "Schedules of Reinforcement", is full of cumulative records produced by this device. Air crib. The air crib is an easily cleaned, temperature- and humidity-controlled box-bed intended to replace the standard infant crib. After raising one baby, Skinner felt that he could simplify the process for parents and improve the experience for children; he thought of the idea primarily to help his wife cope with the day-to-day tasks of child rearing. Skinner had some specific concerns about raising a baby in the harsh Minnesota environment where he lived, and keeping the child warm was a central priority (Faye, 2010). Though this was the main goal, the crib was also designed to reduce laundry, diaper rash, and cradle cap, while still allowing the baby to be more mobile and comfortable. It reportedly had some success in these goals: the crib was sold commercially, and an estimated 300 children were raised in one. "Psychology Today" tracked down 50 of these children and ran a short piece on the effects of the air crib; the reports were positive, with the children and parents saying they had enjoyed using it (Epstein, 2005). One of these air cribs resides in the gallery at the Center for the History of Psychology in Akron, Ohio (Faye, 2010). The air crib was designed with three solid walls and a safety-glass panel at the front that could be lowered to move the baby in and out of the crib. The floor was stretched canvas, over which sheets were laid; these were easily rolled off when soiled. Addressing Skinner's concern for temperature, a control box on top of the crib regulated temperature and humidity, and filtered air flowed through the crib from below.
This crib was higher than most standard cribs, allowing easier access to the child without the need to bend over (Faye, 2010). The air crib was a controversial invention. It was popularly characterized as a cruel pen and was often compared to Skinner's operant conditioning chamber (or "Skinner box"). Skinner's article in "Ladies' Home Journal", titled "Baby in a Box", caught the eye of many and contributed to skepticism about the device (Bjork, 1997). A picture published with the article showed the Skinners' daughter, Deborah, peering out of the crib with her hands and face pressed against the glass. Skinner also used the term "experiment" when describing the crib, and this association with laboratory animal experimentation discouraged the crib's commercial success, although several companies attempted to produce and sell it. In 2004, therapist Lauren Slater repeated a claim that Skinner may have used his baby daughter in some of his experiments. His outraged daughter publicly accused Slater of not making a good-faith effort to check her facts before publishing. Deborah was quoted by "The Guardian" as saying: "According to "Opening Skinner's Box: Great Psychological Experiments of the Twentieth Century", my father, who was a psychologist based at Harvard from the 1950s to the 90s, "used his infant daughter, Deborah, to prove his theories by putting her for a few hours a day in a laboratory box . . . in which all her needs were controlled and shaped". But it's not true. My father did nothing of the sort." Teaching machine. The teaching machine was a mechanical device whose purpose was to administer a curriculum of programmed learning. The machine embodied key elements of Skinner's theory of learning and had important implications for education in general and classroom instruction in particular. In one incarnation, the machine was a box that housed a list of questions that could be viewed one at a time through a small window. There was also a mechanism through which the learner could respond to each question; upon delivering a correct answer, the learner was rewarded. Skinner advocated the use of teaching machines for a broad range of students (e.g., preschool-aged to adult) and instructional purposes (e.g., reading and music); one machine that he envisioned, for example, could teach rhythm. The instructional potential of the teaching machine stemmed from several factors: it provided automatic, immediate, and regular reinforcement without the use of aversive control; the material presented was coherent, yet varied and novel; and the pace of learning could be adjusted to suit the individual. As a result, students were interested, attentive, and learned efficiently by producing the desired behavior, "learning by doing." Teaching machines, though perhaps rudimentary, were not rigid instruments of instruction. They could be adjusted and improved based upon the students' performance. For example, if a student made many incorrect responses, the machine could be reprogrammed to provide less advanced prompts or questions, the idea being that students acquire behaviors most efficiently if they make few errors. Multiple-choice formats were not well suited to teaching machines because they tended to increase student mistakes and left the contingencies of reinforcement relatively uncontrolled. Not only useful in teaching explicit skills, machines could also promote the development of a repertoire of behaviors that Skinner called self-management.
Effective self-management means attending to stimuli appropriate to a task, avoiding distractions, reducing the opportunity of reward for competing behaviors, and so on. For example, machines encourage students to pay attention before receiving a reward. Skinner contrasted this with the common classroom practice of initially capturing students' attention (e.g., with a lively video) and delivering a reward (e.g., entertainment) before the students have actually performed any relevant behavior. This practice fails to reinforce correct behavior and actually counters the development of self-management. Skinner pioneered the use of teaching machines in the classroom, especially at the primary level. Today computers run software that performs similar teaching tasks, and there has been a resurgence of interest in the topic related to the development of adaptive learning systems. Pigeon-guided missile. During World War II, the US Navy required a weapon effective against surface ships, such as the German "Bismarck"-class battleships. Although missile and TV technology existed, the size of the primitive guidance systems available rendered automatic guidance impractical. To solve this problem, Skinner initiated Project Pigeon, which was intended to provide a simple and effective guidance system. Skinner trained pigeons through operant conditioning to peck at incoming targets shown on a camera obscura screen (Schultz-Figueroa, 2019). The system divided the nose cone of a missile into three compartments, with a pigeon placed in each. Within the nose cone, three lenses projected an image of distant objects onto a screen in front of each bird, so that when the missile was launched from an aircraft within sight of an enemy ship, an image of the ship would appear on the screens. The screens were hinged and connected to the bomb's guidance system through four small rubber pneumatic tubes attached to each side of the frame, which directed a constant airflow to a pneumatic pickup system that controlled the thrusters of the bomb. In this way, the missile was steered toward the targeted ship by nothing more than the pigeons' pecks (Schultz-Figueroa, 2019). Despite an effective demonstration, the project was abandoned, and eventually more conventional solutions, such as those based on radar, became available. Skinner complained that "our problem was no one would take us seriously." Before the project was completely abandoned, it was tested extensively in the laboratory. After the United States Army ultimately turned it down, the United States Naval Research Laboratory picked up Skinner's research and renamed it Project ORCON, a contraction of "organic" and "control". Skinner worked closely with the Naval Research Laboratory, continuously testing the pigeons' capacity for guiding missiles to their intended targets. In the end, the pigeons' performance and accuracy depended on so many uncontrollable factors that Project ORCON, like Project Pigeon before it, was discontinued. It was never used in the field. Verbal summator. Early in his career Skinner became interested in "latent speech" and experimented with a device he called the "verbal summator", which can be thought of as an auditory version of the Rorschach inkblots. When using the device, human participants listened to incomprehensible auditory "garbage" but often read meaning into what they heard.
Thus, as with the Rorschach blots, the device was intended to yield overt behavior that projected subconscious thoughts. Skinner's interest in projective testing was brief, but he later used observations with the summator in creating his theory of verbal behavior. The device also led other researchers to invent new tests such as the tautophone test, the auditory apperception test, and the Azzageddi test. Influence on teaching. Along with psychology, education has also been influenced by Skinner's views, which are extensively presented in his book "The Technology of Teaching", as well as reflected in Fred S. Keller's Personalized System of Instruction and Ogden R. Lindsley's Precision Teaching. Skinner argued that education has two major purposes. He recommended bringing students' behavior under appropriate control by providing reinforcement only in the presence of stimuli relevant to the learning task. Because he believed that human behavior can be affected by small consequences, something as simple as "the opportunity to move forward after completing one stage of an activity" can be an effective reinforcer. Skinner was convinced that, to learn, a student must engage in behavior, not just passively receive information. He believed that effective teaching must be based on positive reinforcement, which is, he argued, more effective at changing and establishing behavior than punishment. He suggested that the main thing people learn from being punished is how to avoid punishment. For example, if a child is forced to practice playing an instrument, the child comes to associate practicing with punishment, dreads it, and seeks to avoid the instrument. This view had obvious implications for the then-widespread practice of rote learning and punitive discipline in education. The use of educational activities as punishment may induce rebellious behavior such as vandalism or absenteeism. Because teachers are primarily responsible for modifying student behavior, Skinner argued that teachers must learn effective ways of teaching. In "The Technology of Teaching" (1968), Skinner devotes a chapter to why teachers fail, arguing that they have not been given an in-depth understanding of teaching and learning; without knowing the science underpinning teaching, teachers fall back on procedures that work poorly or not at all. Skinner suggested that any age-appropriate skill can be taught. Contributions to social theory. Skinner is popularly known mainly for his books "Walden Two" (1948) and "Beyond Freedom and Dignity" (1971), for which he made the cover of "Time" magazine. The former describes a fictional "experimental community" in 1940s United States. The productivity and happiness of citizens in this community are far greater than in the outside world because the residents practice scientific social planning and use operant conditioning in raising their children. "Walden Two", like Thoreau's "Walden", champions a lifestyle that does not support war or foster competition and social strife. It encourages a lifestyle of minimal consumption, rich social relationships, personal happiness, satisfying work, and leisure. In 1967, Kat Kinkade and others founded the Twin Oaks Community, using "Walden Two" as a blueprint. The community still exists and continues to use the Planner-Manager system and other aspects of the community described in Skinner's book, though behavior modification is not a community practice.
In "Beyond Freedom and Dignity", Skinner suggests that a technology of behavior could help to make a better society. We would, however, have to accept that an autonomous agent is not the driving force of our actions. Skinner offers alternatives to punishment, and challenges his readers to use science and modern technology to construct a better society. Political views. Skinner's political writings emphasized his hopes that an effective and human science of behavioral control – a technology of human behavior – could help with problems as yet unsolved and often aggravated by advances in technology such as the atomic bomb. Indeed, one of Skinner's goals was to prevent humanity from destroying itself. He saw political activity as the use of aversive or non-aversive means to control a population. Skinner favored the use of positive reinforcement as a means of control, citing Jean-Jacques Rousseau's novel "" as an example of literature that "did not fear the power of positive reinforcement." Skinner's book, "Walden Two", presents a vision of a decentralized, localized society, which applies a practical, scientific approach and behavioral expertise to deal peacefully with social problems. (For example, his views led him to oppose corporal punishment in schools, and he wrote a letter to the California Senate that helped lead it to a ban on spanking.) Skinner's utopia is both a thought experiment and a rhetorical piece. In "Walden Two", Skinner answers the problem that exists in many utopian novels – "What is the Good Life?" The book's answer is a life of friendship, health, art, a healthy balance between work and leisure, a minimum of unpleasantness, and a feeling that one has made worthwhile contributions to a society in which resources are ensured, in part, by minimizing consumption. Skinner described his novel as "my New Atlantis", in reference to Bacon's utopia. Superstition' in the Pigeon" experiment. One of Skinner's experiments examined the formation of superstition in one of his favorite experimental animals, the pigeon. Skinner placed a series of hungry pigeons in a cage attached to an automatic mechanism that delivered food to the pigeon "at regular intervals with no reference whatsoever to the bird's behavior." He discovered that the pigeons associated the delivery of the food with whatever chance actions they had been performing as it was delivered, and that they subsequently continued to perform these same actions.Skinner suggested that the pigeons behaved as if they were influencing the automatic mechanism with their "rituals", and that this experiment shed light on human behavior:Modern behavioral psychologists have disputed Skinner's "superstition" explanation for the behaviors he recorded. Subsequent research (e.g. Staddon and Simmelhag, 1971), while finding similar behavior, failed to find support for Skinner's "adventitious reinforcement" explanation for it. By looking at the timing of different behaviors within the interval, Staddon and Simmelhag were able to distinguish two classes of behavior: the "terminal response", which occurred in anticipation of food, and "interim responses", that occurred earlier in the interfood interval and were rarely contiguous with food. Terminal responses seem to reflect classical (as opposed to operant) conditioning, rather than adventitious reinforcement, guided by a process like that observed in 1968 by Brown and Jenkins in their "autoshaping" procedures. 
The causation of interim activities (such as the schedule-induced polydipsia seen in a similar situation with rats) also cannot be traced to adventitious reinforcement, and its details are still obscure (Staddon, 1977). Criticism. Noam Chomsky. American linguist Noam Chomsky published a review of Skinner's "Verbal Behavior" in the linguistics journal "Language" in 1959. Chomsky argued that Skinner's attempt to use behaviorism to explain human language amounted to little more than word games: conditioned responses could not account for a child's ability to create or understand an infinite variety of novel sentences. Chomsky's review has been credited with launching the cognitive revolution in psychology and other disciplines. Skinner, who rarely responded directly to critics, never formally replied to Chomsky's critique, but endorsed Kenneth MacCorquodale's 1972 reply. Many academics in the 1960s believed that Skinner's silence on the question meant Chomsky's criticism had been justified, but MacCorquodale wrote that Chomsky's criticism did not actually focus on Skinner's "Verbal Behavior", attacking instead a confusion of ideas from behavioral psychology; MacCorquodale also regretted Chomsky's aggressive tone. Furthermore, Chomsky had aimed at delivering a definitive refutation of Skinner by citing dozens of animal instinct and animal learning studies. On the one hand, he argued that the studies on animal instinct proved that animal behavior is innate, and that Skinner was therefore mistaken; on the other, his reading of the studies on learning was that one cannot draw an analogy from animal studies to human behavior, i.e., that research on animal instinct refutes research on animal learning. Chomsky also reviewed Skinner's "Beyond Freedom and Dignity", using the same basic arguments as in his "Verbal Behavior" review. Among Chomsky's criticisms were that Skinner's laboratory work could not be extended to humans; that when it was extended to humans it represented "scientistic" behavior, attempting to emulate science without being scientific; that Skinner was not a scientist because he rejected the hypothetico-deductive model of theory testing; and that Skinner had no science of behavior. Psychodynamic psychology. Skinner has been repeatedly criticized for his supposed animosity towards Sigmund Freud, psychoanalysis, and psychodynamic psychology. Some have argued, however, that Skinner shared several of Freud's assumptions, and that he was influenced by Freudian points of view in more than one field, among them the analysis of defense mechanisms such as repression. To study such phenomena, Skinner even designed his own projective test, the "verbal summator" described above. J. E. R. Staddon. As understood by Skinner, ascribing "dignity" to individuals involves giving them credit for their actions. To say "Skinner is brilliant" means that Skinner is an originating force. If Skinner's determinist theory is right, he is merely the locus of his environment: he is not an originating force, and he had no choice in saying the things he said or doing the things he did. Skinner's environment and genetics both allowed and compelled him to write his book. Similarly, the environment and genetic potentials of the advocates of freedom and dignity cause them to resist the reality that their own activities are deterministically grounded. J. E. R. Staddon has argued the compatibilist position: Skinner's determinism is not in any way contradictory to traditional notions of reward and punishment, as Skinner believed.
Temple Grandin. In her 2005 book "Animals in Translation", animal behaviorist Temple Grandin claimed that Skinner attempted to touch her legs without consent during a meeting and that she verbally rebuffed him. She recalled that she was approximately 18 years old at the time. She repeated this claim in a 2006 interview with NPR and a 2018 interview with the Center for Autism and Related Disorders, although she stated in the 2006 interview that Skinner touched her before being rebuffed and in the 2018 interview that he asked whether he could touch her before being rebuffed. In the book and both interviews, Grandin also claimed that Skinner was dismissive of her suggestion that a better understanding of how the brain functions would lead to a better understanding of behavior, but that he reversed this position following a stroke later in life. Professional career. Honorary degrees. Skinner received honorary degrees from a number of colleges and universities. Honorary societies. Skinner was inducted into a number of honorary societies.
4870
43767702
https://en.wikipedia.org/wiki?curid=4870
Bill Macy
Wolf Martin Garber (May 18, 1922 – October 17, 2019), known professionally as Bill Macy, was an American television, film and stage actor known for his role in the CBS television series "Maude" (1972–1978). Early life. Macy was born Wolf Martin Garber on May 18, 1922, in Revere, Massachusetts, the son of Mollie (née Friedopfer; 1889–1986) and Michael Garber (1884–1974), a manufacturer. He was raised Jewish in the East Flatbush section of Brooklyn, New York. After graduating from Samuel J. Tilden High School, he served in the United States Army from 1942 to 1946 with the 594th Engineer Boat and Shore Regiment, stationed in the Philippines, New Guinea, and Japan. He worked as a cab driver for a decade before being cast as Walter Matthau's understudy in "Once More, with Feeling" on Broadway in 1958. He portrayed a cab driver on the soap opera "The Edge of Night" in 1966. Macy was an original cast member of the 1969–1972 Off-Broadway sensation "Oh! Calcutta!", performing in the show from 1969 to 1971; he later appeared in the 1972 movie version of the musical. Of appearing fully nude with the rest of the cast in the stage show, he said, "The nudity didn't bother me. I'm from Brooklyn." Macy performed on the P. D. Q. Bach album "The Stoned Guest" (1970). Television. Appreciating Macy's comedic skills off Broadway, Norman Lear brought him to Hollywood, where he first got a small part as a police officer in "All in the Family". He was then cast as Walter Findlay, the husband of the title character on the 1970s television sitcom "Maude", starring Bea Arthur; the show ran for six seasons, from 1972 to 1978. Strangers on the street often called him "Mr. Maude", consoling him for having such a difficult wife. "I used to tell them that people like that really existed," Macy explained. In 1975, Macy and Samantha Harper Macy appeared on the game show "Tattletales". In 1986, Macy was a guest on the fourth episode of "L.A. Law", playing an older man whose young wife wants a music career. Also that year, he guest-starred in "Cindy", an episode of "Highway to Heaven". Macy appeared in the television movie "Perry Mason: The Case of the Murdered Madam" (1987) as banker Richard Wilson, and appeared on "The Facts of Life" in a 1988 episode. He occasionally appeared on "Seinfeld" as one of the residents of the Florida retirement community where Jerry Seinfeld's parents lived. Macy made guest appearances as a patient on "Chicago Hope" and as an aging gambler on the series "Las Vegas". His last television role was in a 2010 episode of Jada Pinkett Smith's series "Hawthorne". Film. Macy appeared as the jury foreman in "The Producers" (1967), with the memorable sole line "We find the defendants incredibly guilty". Other memorable roles include the co-inventor of the "Opti-Grab" in the 1979 Steve Martin comedy "The Jerk" and the head television writer in "My Favorite Year" (1982). Other film credits included roles in "Death at Love House" (1976), "The Late Show" (1977), "Serial" (1980), "Movers & Shakers" (1985), "Bad Medicine" (1985), "Tales from the Darkside" (1985, "Lifebomb" episode), "Sibling Rivalry" (1990), "The Doctor" (1991), "Me Myself & I" (1992), "Analyze This" (1999), "Surviving Christmas" (2004), "The Holiday" (2006), and "Mr. Woodcock" (2007). Personal life. Macy met his future wife, Samantha Harper, on the set of "Oh! Calcutta!" in 1969; they married in 1975. Macy died in Los Angeles on October 17, 2019, at the age of 97; no cause was given.
4871
1261736
https://en.wikipedia.org/wiki?curid=4871
Bob Knight
Robert Montgomery Knight (October 25, 1940 – November 1, 2023) was an American men's college basketball coach. Nicknamed "the General", he won 902 NCAA Division I men's basketball games, a record at the time of his retirement and the sixth-highest total at the time of his death. Knight was the head coach of the Army Black Knights (1965–1971), the Indiana Hoosiers (1971–2000), and the Texas Tech Red Raiders (2001–2008). While at Army, he led the Black Knights to four post-season tournament appearances in six seasons, winning two-thirds of his games along the way. After taking the job at Indiana, his teams won three NCAA championships, one National Invitation Tournament (NIT) championship, and 11 Big Ten Conference championships. His 1975–76 team won the 1976 NCAA tournament and is the last men's team in Division I college basketball to go undefeated during an entire season (32–0); as of the end of the 2024–25 season, they remain the last team to be undefeated national champions. In the seven full seasons that he coached at Texas Tech, his teams qualified for a post-season tournament five times. He retired partway through the 2007–08 season and was replaced by his son Pat Knight at Texas Tech. He later worked as a men's college basketball studio analyst at ESPN. Knight sparked controversy with his outspoken nature and his volatility. He once threw a chair across the court during a game and was once arrested following a physical confrontation with a police officer. He was accused of choking an Indiana player during practice in an incident that was recorded on video, prompting the university to institute a "zero tolerance" policy for him; following a subsequent run-in with a student, he was fired by Indiana University in the fall of 2000. Knight was one of college basketball's most successful and innovative coaches, having popularized the motion offense. He received national coach of the year honors four times and Big Ten Coach of the Year honors eight times. He was also successful on the international stage, winning gold medals at both the 1979 Pan American Games and the 1984 Summer Olympics with the U.S. men's national team. He is one of only three basketball coaches to win an NCAA title, an NIT title, and an Olympic gold medal. He was inducted into the Naismith Memorial Basketball Hall of Fame in 1991. Early life and college career. Knight was born on October 25, 1940, in Massillon, Ohio, and grew up in Orrville, Ohio. His father, Pat, worked for the railroad, and his mother, Hazel, was a school teacher. He began playing organized basketball at Orrville High School. Knight continued at Ohio State in 1958, where he played for Basketball Hall of Fame coach Fred Taylor. Despite being a star player in high school, he played a reserve role as a forward on the 1960 Ohio State Buckeyes team that won the NCAA championship and featured future Hall of Fame players John Havlicek and Jerry Lucas. Knight was also a member of the 1961 and 1962 Buckeyes teams that lost in the finals to the Cincinnati Bearcats. Due in part to the star power of those Ohio State teams, Knight usually received scant playing time, but that did not prevent him from making an impact. In the 1961 NCAA championship game, Knight came off the bench with 1:41 on the clock and Cincinnati leading Ohio State, 61–59. In the words of then–Ohio State assistant coach Frank Truitt: Knight got the ball in the left front court and faked a drive into the middle.
Then [he] crossed over like he worked on it all his life and drove right in and laid it up. That tied the game for us, and Knight ran clear across the floor like a 100-yard dash sprinter and ran right at me and said, "See there, coach, I should have been in that game a long time ago!" To which Truitt replied, "Sit down, you hot dog. You're lucky you're even on the floor." In addition to lettering in basketball at Ohio State, it has been claimed that Knight also lettered in football and baseball; however, the official list of Ohio State football letter earners does not include him. Knight graduated with a degree in history and government in 1962. After graduating from Ohio State, he coached junior varsity basketball at Cuyahoga Falls High School in Ohio for one year. Knight then enlisted in the U.S. Army, serving on active duty from June 1963 to June 1965 and in the U.S. Army Reserve from June 1965 to May 1969. He completed initial training at Fort Leonard Wood, Missouri, was transferred to West Point, New York, in September 1963, and attained the rank of private first class. Coaching career. Army. While in the army, he accepted an assistant coaching position with the Army Black Knights in 1963; two years later, he was named head coach at the relatively young age of 24. In six seasons as head coach at West Point, Knight won 102 games, the first coming against Worcester Polytechnic Institute. He led Army to four NITs, advancing to the semifinals three times. One of his players was Mike Krzyzewski, who later served as his assistant before becoming a Hall of Fame head coach at Duke. Mike Silliman was another of Knight's players at Army, and Knight was quoted as saying that Silliman was the best player he had coached. During his tenure at Army, Knight gained a reputation for having an explosive temper. After Army's 66–60 loss to BYU and Hall of Fame coach Stan Watts in the semifinals of the 1966 NIT, Knight completely lost control, kicking lockers and verbally blasting the officials. Embarrassed, he later went to Watts' hotel room and apologized. Watts forgave him, and is quoted as saying, "I want you to know that you're going to be one of the bright young coaches in the country, and it's just a matter of time before you win a national championship." Knight was one of seven candidates vying to fill the Wisconsin men's basketball head coaching vacancy after John Erickson resigned to become the Milwaukee Bucks' first-ever general manager on April 3, 1968. Knight was offered the position but requested more time to think it over. By the time he had returned to West Point, news that he was to become the Badgers' new coach had been prematurely leaked to the local media. After consulting with Bo Schembechler, who had had a similarly negative experience as a Wisconsin football coaching candidate the previous year, Knight withdrew his candidacy and continued to coach at Army for three more seasons. Erickson's assistant coach John Powless was promoted instead. Indiana. In 1971, Indiana University Bloomington hired Knight as head coach. During his 29 years at the school, the Hoosiers won 662 games, including 22 seasons of 20 or more wins, and lost 239, a .735 winning percentage. In 24 NCAA tournament appearances at Indiana, Hoosier teams under Knight won 42 of 63 games (.667), winning titles in 1976, 1981, and 1987, and losing in the semifinals in 1973 and 1992. 1970s.
In 1972–73, Knight's second year as coach, Indiana won the Big Ten championship and reached the Final Four, losing to UCLA, which was on its way to its seventh consecutive national title. The following season, 1973–74, Indiana once again captured a Big Ten title. In 1974–75 and 1975–76, the Hoosiers were undefeated in the regular season and won all 37 of their Big Ten games, taking their third and fourth conference championships in a row. In 1974–75, the Hoosiers swept the entire Big Ten by an average of 22.8 points per game; however, in an 83–82 win against Purdue, they lost consensus All-American forward Scott May to a broken left arm. With May's injury limiting him to seven minutes of play, the No. 1 Hoosiers lost to Kentucky 92–90 in the Mideast Regional. Despite the loss, the Hoosiers were so dominant that four starters (Scott May, Steve Green, Kent Benson, and Quinn Buckner) made the five-man All-Big Ten team. The following season, 1975–76, the Hoosiers went through the entire season and the 1976 NCAA tournament without a single loss, defeating Michigan 86–68 in the title game. Immediately after the game, Knight lamented that "it should have been two." The 1976 Hoosiers remain the last undefeated NCAA Division I men's basketball team. Behind the play of Mike Woodson, Indiana won the 1979 NIT championship. Throughout the 1970s, however, Knight became involved in several controversies. 1960 Olympic gold medalist Douglas Blubaugh was head wrestling coach at IU from 1973 to 1984. Early in Blubaugh's tenure, while he was jogging in the practice facility during basketball practice, Knight yelled at him to leave, using more than one expletive. Blubaugh pinned Knight to a wall and told him never to repeat the performance, and Knight never did. On December 7, 1974, Indiana defeated Kentucky 98–74. Near the end of the game, Knight went to the Kentucky bench, where the official was standing, to complain about a call. Before he left, Knight hit Kentucky coach Joe B. Hall in the back of the head. Kentucky assistant coach Lynn Nance, a former FBI agent, had to be restrained by Hall from hitting Knight. Hall later said, "It publicly humiliated me." Knight blamed the furor on Hall, stating, "If it was meant to be malicious, I'd have blasted the fucker into the seats." Years after the incident, it was reported that Knight had choked and punched Indiana University's longtime sports information director, Kit Klingelhoffer, over a news release that upset the coach. In 1976, Knight grabbed IU basketball player Jim Wisman and jerked him into his seat. 1980s. The 1979–80 Hoosiers, led by Mike Woodson and Isiah Thomas, won the Big Ten championship and advanced to the 1980 Sweet Sixteen. The following season, 1980–81, Thomas and the Hoosiers once again won a conference title and won the 1981 NCAA tournament, Knight's second national title. In 1982–83, with the strong play of Uwe Blab and All-Americans Ted Kitchel and Randy Wittman, the No. 1-ranked Hoosiers were favorites to win another national championship; however, after a mid-season injury to Kitchel, the Hoosiers lost to Kentucky in the 1983 Sweet Sixteen. In the 1985–86 season, the Hoosiers were profiled in the bestselling book "A Season on the Brink".
To write it, Knight granted author John Feinstein almost unprecedented access to the Indiana basketball program, as well as insights into his private life. The following season, 1986–87, the Hoosiers were led by All-American Steve Alford and captured a share of the Big Ten title. The team won Knight's third national championship (the school's fifth) against Syracuse in the 1987 NCAA tournament, on a game-winning jump shot by Keith Smart with five seconds remaining. In the 1988–89 season, the Hoosiers were led by All-American Jay Edwards and won a Big Ten championship. Knight was involved in several controversies in the 1980s as well. In a game between Indiana and Purdue in Bloomington on January 31, 1981, Isiah Thomas allegedly hit Purdue guard Roosevelt Barnes in what some critics described as a "sucker punch"; video replay later shown by Knight, however, showed that Barnes had thrown the first punch and that Thomas was merely reacting to it. When the two schools played their second game of the season at Purdue on February 7, 1981, Knight claimed a number of derisive chants were directed at him, his wife, and Indiana University. In response, Knight invited Purdue athletic director George King onto his weekly television show to discuss the matter, but King declined. In place of King, Knight therefore brought onto the show a "jackass" (male donkey) wearing a Purdue hat as a representative of Purdue. On February 23, 1985, five minutes into a Purdue–Indiana game in Bloomington, a scramble for a loose ball resulted in a foul call on Indiana's Marty Simmons, and immediately after the resumption of play, a foul was called on Indiana's Daryl Thomas. Knight, irate, insisted the first of the two calls should have been for a jump ball and received a technical foul. Purdue's Steve Reid stepped to the line to shoot the resulting free throws, but before he could, Knight grabbed a red plastic chair from Indiana's bench and threw it across the floor toward the basket in front of Reid. Knight was charged with a second and third technical foul and ejected from the game. He apologized for his actions the next day and was given a one-game suspension and two years' probation by the Big Ten. In later years, Knight would occasionally joke about the chair-throwing incident by saying that he saw an old lady standing on the opposite sideline and threw her the chair so she could sit down. Former Indiana basketball player Todd Jadlow has written a book alleging that from 1985 to 1989, Knight punched him in the face, broke a clipboard over the top of his head, and squeezed his testicles and those of other Hoosiers, among other abuses. In an April 1988 interview with Connie Chung, discussing an Indiana game in which he felt the referees were making poor calls against the Hoosiers, Knight said, "I think that if rape is inevitable, relax and enjoy it." Women's groups nationwide were outraged by the comment. 1990–2000. From 1990–91 through 1992–93, the Hoosiers posted 87 victories, the most by any Big Ten team in a three-year span, breaking the mark of 86 set by Knight's Indiana teams of 1974–76. They captured two Big Ten crowns, in 1990–91 and 1992–93, and reached the Final Four during the 1991–92 season. In 1992–93, the 31–4 Hoosiers finished at the top of the AP Poll but were defeated by Kansas in the Elite Eight.
Players from the team in this era included Greg Graham, Pat Knight, All-Americans Damon Bailey and Alan Henderson, Brian Evans, and National Player of the Year Calbert Cheaney. Throughout the mid- and late 1990s, Knight continued to experience success, with continual NCAA tournament appearances and a minimum of 19 wins each season; however, 1993 brought Knight's last conference championship and 1994 his last trip to the Sweet Sixteen. Throughout the 1990s, Knight was again involved in several controversies. Dismissal from Indiana. On March 14, 2000, just before Indiana was to begin play in the NCAA tournament, the CNN/Sports Illustrated network ran a piece on Robert Abbott's investigation of Knight, in which former player Neil Reed claimed he had been choked by Knight during a practice in 1997. Knight denied the claims. Less than a month later, however, the network aired a tape of an Indiana practice from 1997 that appeared to show Knight placing his hand on Reed's neck. In response, Indiana University president Myles Brand announced that he had adopted a "zero tolerance" policy with regard to Knight's behavior. Later that year, in September 2000, Indiana freshman Kent Harvey (not a basketball player) reportedly said, "Hey, Knight, what's up?" to Knight. According to Harvey, Knight then grabbed him by the arm and lectured him for not showing him respect, insisting that Harvey address him as either "Mr. Knight" or "Coach Knight" instead of simply "Knight". Brand stated that this incident was only one of numerous complaints that occurred after the zero-tolerance policy had been put into place. Brand asked Knight to resign on September 10, and when Knight refused, relieved him of his coaching duties effective immediately. Knight's dismissal was met with outrage from students: that night, thousands of Indiana students marched from Indiana University's Assembly Hall to Brand's home, burning Brand in effigy. Harvey was supported by some and vilified by many who claimed he had intentionally set Knight up. Harvey's stepfather, Mark Shaw, was a former Bloomington-area radio talk show host and a Knight critic. On September 13, Knight said goodbye to a crowd of some 6,000 supporters in Dunn Meadow at Indiana University, asking that they not hold a grudge against Harvey and that they continue to support the basketball team. Knight's firing made national headlines, including the cover of "Sports Illustrated" and around-the-clock coverage on ESPN, as well as mentions on CNN and CBS. Two days after Knight was fired, Jeremy Schaap of ESPN interviewed him about his time at Indiana. Towards the end of the interview, Knight talked about his son Pat, who had also been dismissed by the university, wanting an opportunity to be a head coach. Schaap, thinking that Knight was finished, attempted to move on to another subject, but Knight insisted on continuing about his son. Schaap repeatedly tried to ask another question, whereupon Knight shifted the conversation to Schaap's style of interviewing, notably chastising him about interruptions. Knight then commented, referring to Schaap's father, Dick Schaap, "You've got a long way to go to be as good as your dad." In a March 2017 interview on "The Dan Patrick Show", Knight stated that he had no interest in ever returning to Indiana.
When host Dan Patrick commented that most of the administrators who had fired Knight seventeen years earlier were no longer there, Knight said, "I hope they're all dead." Knight ultimately returned to Assembly Hall at halftime of Indiana's game against Purdue on February 8, 2020, and received a rousing standing ovation; it was the first Indiana game he had attended since his dismissal by the school 20 years earlier. Texas Tech. Following his dismissal from Indiana, Knight took a season off while on the lookout for coaching vacancies. He accepted the head coaching position at Texas Tech University, although his hiring was opposed by a faculty group led by Walter Schaller, an associate professor of philosophy. When he was introduced at the press conference, Knight quipped, "This is without question the most comfortable red sweater I've had on in six years." Knight quickly improved the program, which had not been to an NCAA tournament since 1996, leading the Red Raiders to postseason appearances in each of his first four years at the school (three NCAA tournaments and one NIT). After a rough 2006 season, the team improved in 2007, finishing 21–13 and again making the NCAA tournament, where it lost to Boston College in the first round. The best performance by the Red Raiders under Knight came in 2005, when they advanced as far as the Sweet Sixteen. In both 2006 and 2007, Texas Tech under Knight defeated two top 10-ranked teams in consecutive weeks. During Knight's first six years at Texas Tech, the Red Raiders won 126 games. Knight was also involved in several controversies at Texas Tech. In March 2006, a student's heckling at Baylor University resulted in Knight having to be restrained by a police officer; the incident was not severe enough to warrant any action from the Big 12 Conference. On November 13, 2006, Knight was shown allegedly hitting player Michael Prince under the chin to get him to make eye contact. Although Knight did not comment on the incident afterwards, Prince, his parents, and Texas Tech athletic director Gerald Myers insisted that Knight did nothing wrong and that he merely lifted Prince's chin and told him, "Hold your head up and don't worry about mistakes. Just play the game." Prince commented, "He was trying to teach me and I had my head down so he raised my chin up. He was telling me to go out there and don't be afraid to make mistakes. He said I was being too hard on myself." ESPN analyst Fran Fraschilla defended Knight by saying, "That's coaching!" On October 21, 2007, James Simpson of Lubbock, Texas, accused Knight of firing a shotgun in his direction after he yelled at Knight and another man for hunting too close to his home. Knight denied the allegations; however, an argument between the two men was recorded via camera phone and later aired on television. Knight won the 900th game of his coaching career on January 16, 2008, a 68–58 win against Texas A&M, but not before arguing with referees during the game. Retirement. On February 4, 2008, Knight announced his retirement. His son Pat Knight, the head coach designate since 2005, was immediately named as his successor at Texas Tech. The younger Knight said that after many years of coaching, his father was exhausted and ready to retire. Just after achieving his 900th win, Knight handed the job over to Pat in mid-season, in part to let Pat get acquainted with coaching the team sooner, rather than waiting until October, the start of the next season.
Knight continued to live in Lubbock after he retired. United States national team. 1979 Pan American Games. In 1978, Knight was named the head coach of the United States men's national team for the 1979 Pan American Games in San Juan, Puerto Rico. The team, which included players such as Isiah Thomas and Ralph Sampson, trained together for more than 50 days and played in a tournament in Italy before arriving in Puerto Rico. During the games, Knight led the United States to a 9–0 record, an average margin of victory of 21.2 points, and the gold medal. However, his behavior during the games, where he feuded with referees and officials and made critical comments about Puerto Rico, was heavily criticized, including by the president of the Basketball Federation of Puerto Rico, Arturo C. Gallardo, in a lengthy article in "The New York Times". During the first game, with the United States leading by 35 points, Knight was ejected for arguing with referees. In another incident, during a practice session, Knight was accused of assaulting the policeman guarding the gymnasium and was arrested. Both Knight and assistant coach Mike Krzyzewski disputed the policeman's version of the incident, with Krzyzewski stating, "It's really unbelievable, the out-and-out lies that are being told. It's like my standing here and saying that my name is not Mike Krzyzewski, that it's Fred Taylor." Knight was charged with assault and summoned to appear before a judge, but he left the island before the trial was held and refused to return, with Indiana officials rejecting Puerto Rico's extradition requests. He was later tried in absentia, found guilty, and sentenced to a six-month prison term and a $500 fine. Following a 1987 United States Supreme Court ruling that overturned a law giving state governors the power to reject extradition requests, which opened up the possibility of his extradition to Puerto Rico, Knight wrote a letter to the president of the Puerto Rico Olympic Committee, Germán Rieckehoff, apologizing for the incident. Rieckehoff "urged the Commonwealth not to consider any further legal action against Knight". 1984 Summer Olympics. Despite the controversies, Knight was selected in 1982 to coach the U.S. national team at the 1984 Summer Olympic Games. He held a 72-player tryout camp in April 1984 before settling on a 12-man roster that included Michael Jordan, Patrick Ewing, Chris Mullin, and Knight's Indiana player and protégé Steve Alford. Worries that his behavior would again cause embarrassment during the games turned out to be unfounded; despite rants and raves at officials, Knight was considered to be on his best behavior. He led the United States to victory in all eight games and to the gold medal. In doing so, Knight joined Pete Newell and Dean Smith as the only coaches to win an NCAA title, an NIT title, and Olympic gold. Life after coaching. In 2008, ESPN hired Knight as a studio analyst and occasional color commentator. In November 2012, he called an Indiana men's basketball game for the first time, something he had previously refused to do; then-head coach Tom Crean had reached out to Knight in an attempt to get him to visit the school again. On April 2, 2015, ESPN announced that it would not renew its contract with Knight. On February 27, 2019, Don Fischer, an IU radio announcer since 1974, said during an interview that Knight was in ill health; he added that Knight's health "has declined" but did not offer any specifics.
On April 4, 2019, Knight made his first public appearance since Fischer's comments. He appeared with longtime friend and journalist Bob Hammel and spoke about different aspects of his career. During the presentation, Knight seemed to struggle with his memory: he re-introduced his wife to the audience after doing so only 10 minutes earlier, he mistakenly said that former IU basketball player Landon Turner had died, and, after telling a story about Michael Jordan, he later told the same story, replacing Jordan with former IU basketball player Damon Bailey. Knight and his wife had resided in Lubbock, Texas, since his retirement. On July 10, 2019, the "Indiana Daily Student", IU's campus newspaper, reported that Knight and his wife had purchased a home in Bloomington for $572,500, suggesting that Knight had decided to return to Bloomington to live. Coaching philosophy. Knight was an innovator of the motion offense, which he perfected and popularized. The system emphasizes post players setting screens and perimeter players passing the ball until a teammate becomes open for an uncontested jump shot or lay-up. This requires players to be unselfish, disciplined, and effective in setting and using screens to get open. Knight's motion offense did not take shape until he began coaching at Indiana. Prior to that, at Army, he ran a "reverse action" offense that involved reversing the ball from one side of the floor to the other and screening along with it. According to Knight, it was a "West Coast offense" that Pete Newell used exclusively during his coaching career. After being exposed to the Princeton offense, Knight instilled more cutting into the offense he employed, which evolved into the motion offense that he ran for most of his career. Knight continued to develop the offense, instituting different cuts over the years and putting his players in different scenarios. Knight was well known for the extreme preparation he put into each game and practice. He was often quoted as saying, "Most people have the will to win, few have the will to prepare to win." Often during practice, Knight would direct his players to a certain spot on the floor and give them options of what to do based on how the defense might react. In contrast to set plays, Knight's offense was designed to react according to the defense. The three-point shot was adopted by the NCAA in 1986, midway through Knight's coaching career. Although he opposed the rule change throughout his life, it did complement his offense well by improving the spacing on the floor. He sardonically said at the time that he supported the institution of the three-point shot because if a team's offense was functioning efficiently enough to get a layup, the team should be rewarded with three points for that basket. Knight's offense also emphasized a two-count: players in the post were expected to try to post up in the paint for two seconds and, if they did not receive the ball, to go set a screen; players with the ball were expected to hold it for two seconds to see where they would take it; and screens were supposed to be held for two seconds as well. On defense, Knight was known for emphasizing tenacious "man-to-man" defense in which defenders contest every pass and every shot and help teammates when needed. However, Knight also incorporated a zone defense periodically after eschewing it for the first two decades of his coaching career. Knight's coaching also included a firm emphasis on academics.
All but four of his four-year players completed their degrees, a rate of nearly 98 percent. Nearly 80 percent of his players graduated, a figure much higher than the national average of 42 percent for Division I schools. Legacy. Accomplishments. Knight's all-time coaching record is 902–371. His 902 wins in NCAA Division I men's college basketball games rank fourth all-time, behind Knight's former player and former Duke head coach Mike Krzyzewski, Syracuse head coach Jim Boeheim, and North Carolina head coach Roy Williams. Knight achieved his 880th career win on January 1, 2007, passing retired North Carolina coach Dean Smith for the most career victories, a title he held until his win total was surpassed by Krzyzewski on November 15, 2011. It was later surpassed by Boeheim on January 2, 2013, and by Williams on March 11, 2021. Knight is the youngest coach to reach 200, 300, and 400 wins, and he was also among the youngest to reach the 500-, 600-, 700-, and 800-win milestones. Texas Tech's participation in the 2007 NCAA tournament gave Knight more NCAA tournament appearances than any other coach. He is the only coach to win the NCAA tournament, the NIT, an Olympic gold medal, and a Pan American Games gold medal. Knight is also one of only three people, along with Smith and Joe B. Hall, who have both played on and coached an NCAA tournament championship basketball team. Recognition. Knight received a number of personal honors during and after his coaching career. He was named the National Coach of the Year four times (1975, 1976, 1987, 1989) and Big Ten Coach of the Year eight times (1973, 1975, 1976, 1980, 1981, 1989, 1992, 1993). In 1975 he was a unanimous selection as National Coach of the Year, an honor he was accorded again in 1976 by the Associated Press. In 1987 he was the first person to be honored with the Naismith Coach of the Year Award. In 1989 he garnered National Coach of the Year honors from the AP, UPI, and the United States Basketball Writers Association. Knight was inducted into the Basketball Hall of Fame in 1991. On November 17, 2006, Knight was recognized for his impact on college basketball as a member of the founding class of the National Collegiate Basketball Hall of Fame. The following year, he was the recipient of the Naismith Award for Men's Outstanding Contribution to Basketball. Knight was also inducted into the Army Sports Hall of Fame (Class of 2008) and the Indiana Hoosiers athletics Hall of Fame (Class of 2009). In August 2003, he was honored as the first inductee into The Vince Lombardi Titletown Legends. Three banners hang at Simon Skjodt Assembly Hall commemorating the three national championships won under Knight. Coaching tree. A number of Knight's assistant coaches, players, and managers have gone on to be coaches. In the college ranks, this includes Hall of Fame Duke coach Mike Krzyzewski, Steve Alford, Murry Bartow, Dan Dakich, Bob Donewald, Marty Simmons, Jim Crews, Chris Beard, Matt Bowen, and Dusty May. Among NBA coaches, they include Randy Wittman, Mike Woodson, Keith Smart, Isiah Thomas, and Lawrence Frank. In the media. Books about Knight. In 1986, author John Feinstein published "A Season on the Brink", which detailed the 1985–86 season of the Indiana Hoosiers. Feinstein was granted almost unprecedented access to the Indiana basketball program, as well as insights into Knight's private life; the book quickly became a major bestseller and spawned a new genre, as a legion of imitators wrote works covering a single year of a sports franchise.
In the book, Feinstein depicts a coach with a quick and violent temper, but also one who never cheats and strictly follows all of the NCAA's rules. Two years later, author Joan Mellen penned the book "Bob Knight: His Own Man", in part to rebut Feinstein's "A Season on the Brink". Mellen deals with seemingly all the causes célèbres in Knight's career and presents the view that he is more sinned against than sinning. In 1990, Robert P. Sulek wrote "Hoosier Honor: Bob Knight and Academic Success at Indiana University", which discusses the academic side of the basketball program. The book details all of the players who played for Knight and the degrees they earned. A number of close associates and friends of Knight have also written books about him. Former player and current Nevada Wolf Pack head basketball coach Steve Alford wrote "Playing for Knight: My Six Seasons with Bobby Knight", published in 1990. Former player Kirk Haston wrote "Days of Knight: How the General Changed My Life", published in 2016. Knight's autobiography, written with longtime friend and sports journalist Bob Hammel, was titled "Knight: My Story" and published in 2003. Three years later, Steve Delsohn and Mark Heisler wrote "Bob Knight: An Unauthorized Biography". In 2013, Knight and Bob Hammel published "The Power of Negative Thinking: An Unconventional Approach to Achieving Positive Results". Knight discussed his approach to preparing for a game by anticipating all of the things that could go wrong and trying to prevent them or having a plan to deal with them. In the book, Knight also shared one of his favorite sayings, "Victory favors the team making the fewest mistakes." In 2017, sports reporter Terry Hutchens published "Following the General: Why Three Coaches Have Been Unable to Return Indiana Basketball to Greatness", which discussed Knight's coaching legacy with Indiana and how none of the coaches following him have been able to reach his level of success. Film and television. Knight appeared or was featured in numerous films and television productions. In 1994, the feature film "Blue Chips" featured a character named Pete Bell, a volatile but honest college basketball coach under pressure to win who decides to blatantly violate NCAA rules to field a competitive team after a sub-par season. It starred Nick Nolte as Bell and NBA star Shaquille O'Neal as Neon Bodeaux, a once-in-a-lifetime player whom boosters woo to his school with gifts and other perks. The coach's temper and wardrobe were modeled after Knight's, though Knight was never known to recruit illegally. Knight himself appears in the film and coaches against Nolte's character in the climactic game. ESPN's first feature-length film was "A Season on the Brink", a 2002 TV adaptation of John Feinstein's book. In the film, Knight is played by Brian Dennehy. ESPN also featured Knight in a reality show titled "Knight School", which followed a handful of Texas Tech students as they competed for the right to join the basketball team as a non-scholarship player. Knight made a cameo appearance as himself in the 2003 film "Anger Management". In 2008, Knight appeared in a commercial as part of Volkswagen's Das Auto series, in which Max, a 1964 black Beetle, interviews famous people. When Knight talked about Volkswagen winning the best resale value award in 2008, Max replied, "At least one of us is winning a title this year." This prompted Knight to throw his chair off the stage and walk out, saying, "I may not be retired."
Knight also made an appearance in a TV commercial parodying "Risky Business" with fellow coaches Mike Krzyzewski, Rick Pitino, and Roy Williams. In 2009, Knight produced three instructional coaching DVD libraries—on motion offense, man-to-man defense, and instilling mental toughness—with Championship Productions. Personal life and death. Knight married the former Nancy Falk on April 17, 1963. They had two sons, Tim and Pat. The couple divorced in 1985. Pat played at Indiana from 1991 to 1995, succeeded his father as head coach at Texas Tech, and then was head coach at Lamar until he was dismissed in 2014. In 1988, Knight married his second wife, Karen Vieth Edgar, a former Oklahoma high school basketball coach. Knight had a high regard for education and made generous donations to the schools he was a part of, particularly their libraries. At Indiana University, Knight endowed two chairs, one in history and one in law. He also raised nearly $5 million for the Indiana University library system by championing a library fund to support the library's activities. The fund was ultimately named in his honor. When Knight came to Texas Tech in 2001, he gave $10,000 to the library, while his wife gave $25,000, donations which included the first gifts to the Coach Knight Library Fund, which has since collected over $300,000. Later, in 2005, Knight donated an additional $40,000 to the library. On November 29, 2007, the Texas Tech library honored this with "A Legacy of Giving: The Bob Knight Exhibit". On April 18, 2011, video surfaced showing Knight responding to a question concerning John Calipari and Kentucky's men's basketball team by stating that in the previous season, Kentucky made an Elite Eight appearance with "five players who had not attended a single class that semester." These claims were later disproven by the university and the players in question, including Patrick Patterson, who graduated in three years, and John Wall, who finished the semester with a 3.5 GPA. Knight later apologized for his comments, stating, "My overall point is that 'one-and-dones' are not healthy for college basketball. I should not have made it personal to Kentucky and its players and I apologize." Knight supported Donald Trump's 2016 presidential campaign, and later made an appearance at his rally in Indianapolis for the 2018 midterms. At the rally, Knight called Trump "a great defender of the United States of America". Knight died in Bloomington, Indiana, on November 1, 2023, at age 83; his remains were cremated and buried in Orrville, Ohio.
4874
41143833
https://en.wikipedia.org/wiki?curid=4874
Black metal
Black metal is an extreme subgenre of heavy metal music. Common traits include fast tempos, a shrieking vocal style, heavily distorted guitars played with tremolo picking, raw (lo-fi) recording, unconventional song structures, and an emphasis on atmosphere. Artists often appear in corpse paint and adopt pseudonyms. Venom initiated the "first wave" of black metal, with their 1982 album "Black Metal" giving it its name. In the following years, the style was developed by Bathory, Mercyful Fate, Hellhammer and Celtic Frost. By 1987, this wave had declined, but influential works were released by Tormentor, Sarcófago, Parabellum, Blasphemy, Samael and Rotting Christ. A "second wave" arose in the early 1990s, spearheaded by bands in the early Norwegian black metal scene, such as Mayhem, Darkthrone, Burzum, Immortal, Emperor, Satyricon and Gorgoroth. This Norwegian scene did much to define black metal as a distinct genre, and inspired other scenes in Finland (Beherit, Archgoat, Impaled Nazarene); Sweden (Dissection, Marduk, Abruptum, Nifelheim); the United States (Profanatica, Demoncy, Judas Iscariot, Grand Belial's Key); France (Mütiilation, Vlad Tepes); as well as leading to the founding of influential bands in other countries, including Sigh and Cradle of Filth. Black metal has often sparked controversy. Common themes in the genre are misanthropy, anti-Christianity, Satanism, and ethnic paganism. In the 1990s, members of the scene were responsible for a spate of church burnings and murders. There is also a small neo-Nazi movement within black metal, although it has been shunned by many prominent artists. Generally, black metal strives to remain an underground phenomenon. Characteristics. Although black metal now typically refers to the Norwegian style with shrieking vocals, the term has been applied to bands with widely differing sounds, such as the Greek and Finnish bands that emerged around the same time as the Norwegian scene. Instrumentation and song structure. Manish Agarwal of "Time Out" describes black metal as "a cult strain of ultra-thrash" characterized by "icy noise". Norwegian-inspired black metal guitarists usually favor high-pitched or trebly guitar tones and heavy distortion. The guitar is usually played with fast, unmuted tremolo picking and power chords. Guitarists often use dissonance—along with specific scales, intervals and chord progressions—to create a sense of dread. The tritone, or flat-fifth, is often used. Guitar solos and low guitar tunings are rare in black metal. The bass guitar is seldom used to play stand-alone melodies. It is common for the bass to be muted against the guitar, or for it to homophonically follow the low-pitched riffs of the guitar. While electronic keyboards are not a standard instrument, some bands, like Dimmu Borgir, use keyboards "in the background" or as "proper instruments" for creating atmosphere. Some newer black metal bands have raised their production quality and introduced additional instruments such as synthesizers and even orchestras. The drumming is usually fast and relies on double-bass and blast beats to maintain tempos that sometimes approach 300 beats per minute. These fast tempos require great skill and physical stamina, typified by black metal drummers Frost (Kjetil-Vidar Haraldstad) and Hellhammer (Jan Axel Blomberg). Even so, authenticity is prioritized over technique. "This professionalism has to go," insists well-respected drummer Fenriz (Gylve Fenris Nagell) of Darkthrone.
"I want to "de-learn" playing drums, I want to play primitive and simple, I don't want to play like a drum solo all the time and make these complicated riffs". Black metal songs often stray from conventional song structure and often lack clear verse-chorus sections. Instead, many black metal songs contain lengthy and repetitive instrumental sections. The Greek style—established by Rotting Christ, Necromantia and Varathron—has more death metal traits than Norwegian black metal. Vocals and lyrics. Traditional black metal vocals are raspy and high-pitched, and include shrieking, screaming, and snarling. This vocal style influenced by Quorthon of Bathory. Death growls are sometimes used, but less often than the characteristic black metal shriek. Manish Agarwal of "Time Out" describes the lyrical content of black metal as "sacrilegious bile". Typically, lyrics are against Christianity and other institutional religions, often using apocalyptic language and include anti-authoritarian and anti-establishment messages against religious governments. Satanic lyrics are common, and many see them as essential to black metal. For Satanist black metal artists, "Black metal songs are meant to be like Calvinist sermons; deadly serious attempts to unite the true believers". Misanthropy, global catastrophe, war, death and rebirth are also common themes. Another common theme is that of the wild and extreme aspects of the natural world, particularly the wilderness, forests, mountains, winter, storms, and blizzards. Black metal also has a fascination with the distant past. Many bands write about the mythology and folklore of their homelands and some promote a revival of pre-Christian, pagan traditions. A significant number of bands write lyrics only in their native language and a few (e.g. Arckanum and early Ulver) have lyrics in archaic languages. Some doom metal-influenced artists' lyrics focus on depression, nihilism, introspection, self-harm and suicide. Imagery and performances. Many bands choose not to play live. Many of those who do play live maintain that their performances "are not for entertainment or spectacle. Sincerity, authenticity and extremity are valued above all else". Some bands consider their concerts to be rituals and often make use of stage props and theatrics. Bands such as Mayhem, Gorgoroth, and Watain are noted for their controversial shows, which have featured impaled animal heads, mock crucifixions, medieval weaponry and band members doused in animal blood. A few vocalists, such as Dead, Maniac and Kvarforth, are known for cutting themselves while singing onstage. Black metal artists often appear dressed in black with combat boots, bullet belts, spiked wristbands and inverted crosses and inverted pentagrams. However, the most widely-known trait is their use of corpse paint—black and white face paint sometimes mixed with real or fake blood, which is used to create a corpse-like or demonic appearance. Black metal has a distinct imagery. In the early 1990s, most pioneering black metal artists had minimalist album covers featuring xeroxed black-and-white pictures and logos. This was partly a reaction against death metal bands, who at that time had begun to use brightly colored album artwork. Many purist black metal artists have continued this style. 
Black metal album covers are typically dark and tend to be atmospheric or provocative; some feature natural or fantasy landscapes (for example Burzum's "Filosofem" and Emperor's "In the Nightside Eclipse") while others are violent, sexually transgressive, sacrilegious, or iconoclastic (for example Marduk's "Fuck Me Jesus" and Dimmu Borgir's "In Sorte Diaboli"). Production. The earliest black metal artists had very limited resources, which meant that recordings were often made in homes or basements, giving their recordings a distinctive "lo-fi" quality. However, even when success allowed access to professional studios, many artists instead chose to continue making lo-fi recordings. Artists believed that by doing so, they would both stay true to the genre's underground roots and make the music sound more "raw" or "cold". A well-known example of this approach is the album "Transilvanian Hunger" by Darkthrone, a band who Johnathan Selzer of "Terrorizer" magazine says "represent the DIY aspect of black metal." In addition, lo-fi production was used to keep black metal inaccessible or unappealing to mainstream music fans and those who are not committed. Many have claimed that black metal was originally intended only for those who were part of the scene and not for a wider audience. Former Gorgoroth vocalist Gaahl said that during the genre's infancy, black metal "was never meant to reach [a wide] audience," and that creating it was "purely for [the members of the scene's] own satisfaction." History. Roots. Occult and Satanic themes were present in the music of heavy metal and rock bands of the late 1960s and early 1970s, such as Black Sabbath and Coven. In the late 1970s, the rough and aggressive heavy metal played by the British band Motörhead gained popularity. Many first-wave black metal bands cited Motörhead as an influence. Also popular in the late 1970s, punk rock came to influence the birth of black metal. Tom G. Warrior of Hellhammer and Celtic Frost credited English punk group Discharge as "a revolution, much like Venom", saying, "When I heard the first two Discharge records, I was blown away. I was just starting to play an instrument and I had no idea you could go so far." The use of corpse paint in black metal was mainly influenced by the American 1970s rock band Kiss. First wave (1982–1990). The term "black metal" was coined by the English band Venom with their second album "Black Metal" (1982). With its style of speed metal or proto-thrash metal, the album initiated the "first wave of black metal". The band introduced many tropes that became ubiquitous in the genre, such as blasphemous lyrics and imagery, as well as stage names, costumes and face paint. During this "first wave", black metal and other extreme metal styles like death metal were not yet well-defined genres. Swiss band Hellhammer made "truly raw and brutal music" with Satanic lyrics, and became an important influence on black metal. They recorded three demos in 1983 and released an EP in April 1984. Hellhammer then transformed into Celtic Frost and released their first album, "Morbid Tales", later that year. With their second album, "To Mega Therion" (1985), the band began to explore "more orchestral and experimental territories." In these first years, Celtic Frost was considered one of the world's most extreme and original metal bands, significantly influencing the black metal genre.
Swedish band Bathory created "the blueprint for Scandinavian black metal" and have been described as "the biggest inspiration for the Norwegian black metal movement of the early nineties". Their songs first appeared on the compilation "Scandinavian Metal Attack" in March 1984, which drew much attention to the band, and they released their first album that October. Bathory's music was dark, raw, exceptionally fast, heavily distorted, and anti-Christian, and frontman Quorthon pioneered the shrieked vocals that later came to define black metal. Their third album "Under the Sign of the Black Mark" (1987) was described by journalist Dayal Patterson as creating "the black metal sound as we know it". The Danish band Mercyful Fate influenced the Norwegian scene with their imagery and lyrics. Frontman King Diamond, who wore ghoulish black-and-white facepaint on stage, may have been one of the inspirations for what became known as 'corpse paint'. Other artists that were a part of this wave included Germany's Sodom, Kreator and Destruction, Italy's Bulldozer and Death SS, and Japan's Sabbat. In 1987, in the fifth issue of his "Slayer" fanzine, Jon 'Metalion' Kristiansen wrote that "the latest fad of black/Satanic bands seems to be over", citing United States bands Incubus and Morbid Angel, as well as Sabbat from Great Britain as some of the few continuing the genre. However, black metal continued in the underground, with scenes developing in Brazil with Sepultura, Vulcano, Holocausto and Sarcófago; in Czechoslovakia with Root, Törr and Master's Hammer; and in Sweden with Grotesque, Merciless, Mefisto, Tiamat and Morbid. Sarcófago's debut album "I.N.R.I." (1987) was widely influential on subsequent acts in the genre, especially the second wave Norwegian scene and groups in the war metal style. "BrooklynVegan" writer Kim Kelly called it "a gigantic influence on black metal's sound, aesthetics, and attitude." Furthermore, during this time other influential records in the genre were released by Tormentor (from Hungary), Parabellum (from Colombia), Von (from the United States), Rotting Christ (from Greece), Mortuary Drape (from Italy), Kat (from Poland), Samael (from Switzerland) and Blasphemy (from Canada). Blasphemy's debut album "Fallen Angel of Doom" (1990) is considered one of the most influential records for the war metal style. Fenriz of the Norwegian band Darkthrone called Master's Hammer's debut album "Ritual" "the first Norwegian black metal album, even though they are from Czechoslovakia". It was only during this post-1987 era of bands that the various extreme metal styles began to become more distinct from one another, and the borders of what is now understood as black metal were drawn. Second wave (1991–1997). The second wave of black metal began in the early 1990s. It is generally deemed to have begun with the early Norwegian black metal scene, although "Rock Hard" magazine credits Samael's "Worship Him" (1 April 1991) as its beginning. Norwegian scene. During the early 1990s, a number of Norwegian artists began performing and releasing a new kind of black metal music; this included Mayhem, Darkthrone, Burzum, Immortal, Emperor, Satyricon, Enslaved, Thorns, and Gorgoroth. They crystallized the styles of the "first wave" bands into a distinct genre, popularizing a style of guitar playing developed by Snorre 'Blackthorn' Ruch and Øystein 'Euronymous' Aarseth. Fenriz of Darkthrone described it as being "derived from Bathory" and noted that "those kinds of riffs became the new order for a lot of bands in the '90s".
The wearing of corpse paint became standard, and was a way for many black metal artists to distinguish themselves from other metal bands of the era. The scene also had an ideology and ethos. Artists were bitterly opposed to Christianity and presented themselves as misanthropic Devil worshippers who wanted to spread terror, hatred and evil. They professed to be serious in their views and vowed to act on them. Ihsahn of Emperor said that they sought to "create fear among people" and "be in opposition to society". The scene was exclusive and created boundaries around itself, incorporating only those who were "true" and attempting to expel all "poseurs". Some members of the scene were responsible for a spate of church burnings and murder, which eventually drew attention to it and led to a number of artists being imprisoned. Dead's suicide. On 8 April 1991, Mayhem vocalist Per "Dead" Ohlin committed suicide while left alone in a house shared by the band. Fellow musicians described Dead as odd, introverted and depressed. Mayhem's drummer, Hellhammer, said that Dead was the first to wear the distinctive corpse paint that became widespread in the scene. He was found with slit wrists and a shotgun wound to the head. Dead's suicide note began with "Excuse all the blood", and apologized for firing the weapon indoors. Before calling the police, Euronymous got a disposable camera and photographed the body, after re-arranging some items. One of these photographs was later used as the cover of a bootleg live album, "Dawn of the Black Hearts". Euronymous made necklaces with bits of Dead's skull and gave some to musicians he deemed worthy. Rumors also spread that he had made a stew with bits of Dead's brain. Euronymous used Dead's suicide to foster Mayhem's evil image and claimed Dead had killed himself because extreme metal had become trendy and commercialized. Mayhem bassist Jørn 'Necrobutcher' Stubberud noted that "people became more aware of the black metal scene after Dead had shot himself ... I think it was Dead's suicide that really changed the scene". "Metal Hammer" writer Enrico Ahlig suggests that the notoriety surrounding the suicide marked the beginning of the second wave of black metal. Helvete and Deathlike Silence. During May–June 1991, Euronymous of Mayhem opened an independent record shop named "Helvete" (Norwegian for "Hell") in Oslo. It quickly became the focal point of Norway's emerging black metal scene and a meeting place for many of its musicians, especially the members of Mayhem, Burzum, Emperor and Thorns. Jon 'Metalion' Kristiansen, writer of the fanzine "Slayer", said that the opening of Helvete was "the creation of the whole Norwegian black metal scene". In its basement, Euronymous founded an independent record label named Deathlike Silence Productions. With the rising popularity of his band and others like it, the underground success of Euronymous's label is often credited with encouraging other record labels, which had previously shunned black metal acts, to reconsider and release their material. Church burnings. In 1992, members of the Norwegian black metal scene began a wave of arson attacks on Christian churches. By 1996, there had been at least 50 such attacks in Norway. Some of the buildings were hundreds of years old and seen as important historical landmarks. The first to be burnt down was Norway's Fantoft Stave Church. Police believed Varg Vikernes of Burzum was responsible. The cover of Burzum's EP "Aske" ("ashes") is a photograph of the destroyed church.
In May 1994, Vikernes was found guilty of burning down the Holmenkollen Chapel, Skjold Church, and Åsane Church. In addition, he was found guilty of the attempted arson of a fourth church, and of the theft and storage of 150 kg of explosives. To coincide with the release of Mayhem's "De Mysteriis Dom Sathanas", Vikernes and Euronymous had also allegedly plotted to bomb the Nidaros Cathedral, which appears on the album cover. The musicians Faust and Samoth (both of Emperor), and Jørn Inge Tunsberg (of Hades Almighty), were also convicted for church arsons. Members of the Swedish black metal scene started to burn churches as well in 1993. Those convicted for church burnings showed no remorse and described their actions as a symbolic "retaliation" against Christianity in Norway. Mayhem drummer Hellhammer said he had called for attacks on mosques and Hindu temples, on the basis that they were more foreign. Today, opinions on the church burnings differ within the black metal community. Many members of the early Norwegian scene, such as Infernus and Gaahl of Gorgoroth, continue to praise the church burnings, with the latter saying "there should have been more of them, and there will be more of them". Others, such as Necrobutcher and Kjetil Manheim of Mayhem and Abbath of Immortal, see the church burnings as having been futile. Manheim said that many arsons were "just people trying to gain acceptance" within the black metal scene. Watain vocalist Erik Danielsson respected the attacks, but said of those responsible: "the only Christianity they defeated was the last piece of Christianity within themselves. Which is a very good beginning, of course". Murder of Euronymous. In early 1993, animosity arose between Euronymous and Vikernes. On the night of 10 August 1993, Varg Vikernes (of Burzum) and Snorre 'Blackthorn' Ruch (of Thorns) drove from Bergen to Euronymous's apartment in Oslo. When they arrived, a confrontation began and Vikernes stabbed Euronymous to death. His body was found outside the apartment with 23 cut wounds—two to the head, five to the neck, and sixteen to the back. It has been speculated that the murder was the result of a power struggle, a financial dispute over Burzum records, or an attempt at outdoing a stabbing committed in Lillehammer the year before by Faust. Vikernes denies all of these, claiming that he attacked Euronymous in self-defense. He says that Euronymous had plotted to stun him with an electroshock weapon, tie him up and torture him to death while videotaping the event. He said Euronymous planned to use a meeting about an unsigned contract to ambush him. Vikernes claims he intended to hand Euronymous the signed contract that night and "tell him to fuck off", but that Euronymous panicked and attacked him first. He also claims that most of the cuts were from broken glass Euronymous had fallen on during the struggle. The self-defense story is doubted by Faust, while Necrobutcher confirmed that Vikernes killed Euronymous in self-defense due to the death threats Vikernes had received from him. Vikernes was arrested on 19 August 1993, in Bergen. Many other members of the scene were taken in for questioning around the same time. Some of them confessed to their crimes and implicated others. In May 1994, Vikernes was sentenced to 21 years in prison (Norway's maximum penalty) for the murder of Euronymous, the arson of four churches, and the possession of 150 kg of explosives. However, he confessed only to the latter.
Two churches were burnt the day he was sentenced, "presumably as a statement of symbolic support". Vikernes smiled when his verdict was read, and the picture was widely reprinted in the news media. Blackthorn was sentenced to eight years in prison for being an accomplice to the murder. That month saw the release of Mayhem's album "De Mysteriis Dom Sathanas", which featured Euronymous on guitar and Vikernes on bass guitar. Euronymous's family had asked Mayhem's drummer, Hellhammer, to remove the bass tracks recorded by Vikernes, but Hellhammer said: "I thought it was appropriate that the murderer and victim were on the same record. I put word out that I was re-recording the bass parts. But I never did". In 2003, Vikernes failed to return to Tønsberg prison after being given a short leave. He was re-arrested shortly afterward while driving a stolen car containing various weapons. Vikernes was released on parole in 2009. Outside of Norway. Japanese band Sigh formed in 1990 and was in regular contact with key members of the Norwegian scene. Their debut album, "Scorn Defeat", became "a cult classic in the black metal world". In 1990 and 1991, Northern European metal acts began to release music influenced by these bands or the older ones from the first wave. In Sweden, this included Dissection, Abruptum, Marduk, and Nifelheim. In Finland, there emerged a scene that mixed the first-wave black metal style with elements of death metal and grindcore; this included Beherit, Archgoat and Impaled Nazarene, whose debut album "Tol Cormpt Norz Norz Norz" "Rock Hard" journalist Wolf-Rüdiger Mühlmann considers a part of war metal's roots. Bands such as Demoncy and Profanatica emerged during this time in the United States, when death metal was more popular among extreme metal fans. The Norwegian band Mayhem's concert in Leipzig with Eminenz and Manos in 1990, later released as "Live in Leipzig", was said to have had a strong influence on the East German scene and is even called the unofficial beginning of German black metal. Black metal scenes also emerged on the European mainland during the early 1990s, inspired by the Norwegian scene or the older bands, or both. In Poland, a scene was spearheaded by Graveland and Behemoth. In France, a close-knit group of musicians known as Les Légions Noires emerged; this included artists such as Mütiilation, Vlad Tepes, Belketre and Torgeist. In Belgium, there were acts such as Ancient Rites and Enthroned. Bands such as Black Funeral, Grand Belial's Key and Judas Iscariot emerged during this time in the United States. Black Funeral, formed in Houston in 1993, was associated with black magic and Satanism. A notable black metal group in England was Cradle of Filth, who released three demos in a black/death metal style with symphonic flourishes, followed by the album "The Principle of Evil Made Flesh", which featured a then-unusual hybrid style of black and gothic metal. The band then abandoned black metal for gothic metal, becoming one of the most successful extreme metal bands to date. John Serba of AllMusic commented that their first album "made waves in the early black metal scene, putting Cradle of Filth on the tips of metalheads' tongues, whether in praise of the band's brazen attempts to break the black metal mold or in derision for its 'commercialization' of an underground phenomenon that was proud of its grimy heritage". Some black metal fans did not consider Cradle of Filth to be black metal.
When asked if he considers Cradle of Filth a black metal band, vocalist Dani Filth said he considers them black metal in terms of philosophy and atmosphere, but not in other ways. Another English band, Necropolis, never released any music, but "began a desecratory assault against churches and cemeteries in their area" and "almost caused Black Metal to be banned in Britain as a result". Dayal Patterson says successful acts like Cradle of Filth "provoked an even greater extremity [of negative opinion] from the underground" scene due to concerns about "selling out". The controversy surrounding the Thuringian band Absurd drew attention to the German black metal scene. In 1993, the band's members murdered a boy from their school, Sandro Beyer. A photo of Beyer's gravestone is on the cover of one of their demos, "Thuringian Pagan Madness", along with pro-Nazi statements. It was recorded in prison and released in Poland by Graveland drummer Capricornus. The band's early music was more influenced by Oi! and Rock Against Communism (RAC) than by black metal, and was described as being "more akin to '60s garage punk than some of the […] Black Metal of their contemporaries". Alexander von Meilenwald of the German band Nagelfar considers Ungod's 1993 debut "Circle of the Seven Infernal Pacts", Desaster's 1994 demo "Lost in the Ages", Tha-Norr's 1995 album "Wolfenzeitalter", Lunar Aurora's 1996 debut "Weltengänger" and Katharsis's 2000 debut "666" to be the most important recordings for the German scene. He said they were "not necessarily the best German releases, but they all kicked off something". After the second wave (1998–present). At the beginning of the second wave, the different scenes developed their own styles; as Alan 'Nemtheanga' Averill says, "you had the Greek sound and the Finnish sound, and the Norwegian sound, and there was German bands and Swiss bands and that kind of thing." By the mid-1990s, the style of the Norwegian scene was being adopted by bands worldwide, and in 1998, "Kerrang!" journalist Malcolm Dome said that "black metal as we know it in 1998 owes more to Norway and to Scandinavia than any other particular country". Newer black metal bands also began raising their production quality and introducing additional instruments such as synthesizers and even full-symphony orchestras. By the late 1990s, the underground concluded that several of the Norwegian pioneers—like Emperor, Immortal, Dimmu Borgir, Ancient, Covenant/The Kovenant, and Satyricon—had commercialized or sold out to the mainstream and "big bastard labels." Dayal Patterson states that successful acts like Dimmu Borgir "provoked an even greater extremity [of negative opinion] from the underground" regarding the view that these bands had "sold out." After Euronymous's death, "some bands went more towards the Viking metal and epic style, while some bands went deeper into the abyss." Since 1993, the Swedish scene had carried out church burnings, grave desecration, and other violent acts. In 1995, Jon Nödtveidt of Dissection joined the Misanthropic Luciferian Order (MLO). In 1997, he and another MLO member were arrested and charged with shooting dead a 37-year-old man. It was said he was killed "out of anger" because he had "harassed" the two men. Nödtveidt received a 10-year sentence. As the victim was a homosexual immigrant, Dissection was accused of being a Nazi band, but Nödtveidt denied this and dismissed racism and nationalism.
The Swedish band Shining, founded in 1996, began writing music almost exclusively about depression and suicide, musically inspired by Strid and by Burzum's albums "Hvis lyset tar oss" and "Filosofem". Vocalist Niklas Kvarforth wanted to "force-feed" his listeners "with self-destructive and suicidal imagery and lyrics." In the beginning, he used the term "suicidal black metal" for his music. However, he stopped using the term in 2001 because it had begun to be used by a slew of other bands, who he felt had misinterpreted his vision and were using the music as a kind of therapy rather than as a weapon against the listener, as Kvarforth intended. He said that he "wouldn't call Shining a black metal band" and called the "suicidal black metal" term a "foolish idea." According to Erik Danielsson, when his band Watain formed in 1998, there were very few bands who took black metal as seriously as the early Norwegian scene had. A newer generation of Swedish Satanic bands like Watain and Ondskapt, inspired by Ofermod, the new band of Nefandus member Belfagor, put this scene "into a new light." Kvarforth said, "It seems like people actually [got] afraid again." "The current Swedish black metal scene has a particularly ambitious and articulate understanding of mysticism and its validity to black metal. Many Swedish black metal bands, most notably Watain and Dissection, are [or were] affiliated with the Temple of the Black Light, or Misanthropic Luciferian Order […] a Theistic, Gnostic, Satanic organization based in Sweden". Upon his release in 2004, Jon Nödtveidt restarted Dissection with new members who he felt were able to "stand behind and live up to the demands of Dissection's Satanic concept." He started calling Dissection "the sonic propaganda unit of the MLO" and released a third full-length album, "Reinkaos". The lyrics contain magical formulae from the "Liber Azerate" and are based on the organization's teachings. After the album's release and a few concerts, Nödtveidt said that he had "reached the limitations of music as a tool for expressing what I want to express, for myself and the handful of others that I care about" and disbanded Dissection before dying by suicide. A part of the underground scene adopted a Jungian interpretation of the church burnings and other acts of the early scene as the re-emergence of ancient archetypes, which Kadmon of Allerseelen and the authors of "Lords of Chaos" had implied in their writings. They mixed this interpretation with Paganism and Nationalism. Varg Vikernes was seen as "an ideological messiah" by some, although Vikernes had disassociated himself from black metal and his neo-Nazism had nothing to do with that subculture. This led to the rise of National Socialist black metal (NSBM), which Hendrik Möbus of Absurd calls "the logical conclusion" of the Norwegian black metal "movement". Other parts of the scene oppose NSBM as it is "indelibly linked with Ásatrú and opposed to Satanism", or look upon Nazism "with vague skepticism and indifference". Members of the NSBM scene, among others, see the Norwegian bands as poseurs whose "ideology is cheap", although they still respect Vikernes and Burzum, whom Grand Belial's Key vocalist Richard Mills called "the only Norwegian band that remains unapologetic and literally convicted of his beliefs." In France, besides Les Légions Noires (The Black Legions), an NSBM scene arose.
Members of the French band Funeral desecrated a grave in Toulon in June 1996, and a 19-year-old black metal fan stabbed a priest to death in Mulhouse on Christmas Eve 1996. According to MkM of Antaeus and Aosoth, the early French scene "was quite easy to divide: either you were NSBM, and you had the support from zine and the audience, or you were part of the black legions, and you had that 'cult' aura", whereas his band Antaeus, not belonging to either of these sub-scenes, "did not fit anywhere." Many French bands, like Deathspell Omega and Aosoth, have an avant-garde approach and a disharmonic sound that is representative of that scene. The early American black metal bands remained underground. Some of them—like Grand Belial's Key and Judas Iscariot—joined an international NSBM organization called the Pagan Front, although Judas Iscariot's sole member Akhenaten left the organization. Other bands like Averse Sefira never had any link with Nazism. The US bands have no common style. Many were musically inspired by Burzum but did not necessarily adopt Vikernes's ideas. Profanatica's music is close to death metal, while Demoncy were accused of ripping off Gorgoroth riffs. There also emerged bands like Xasthur and Leviathan (whose music is inspired by Burzum and whose lyrics focus on topics such as depression and suicide), Nachtmystium, Krallice, Wolves in the Throne Room (a band linked to the crust punk scene and the environmental movement), and Liturgy (whose style frontwoman Ravenna Hunt-Hendrix describes as 'transcendental black metal'). These bands eschew black metal's traditional lyrical content for "something more Whitman-esque" and have been rejected by some traditional black-metallers for their ideologies and for the post-rock and shoegazing influences some of them have adopted. Also, some bands like Agalloch started to incorporate "doom and folk elements into the traditional blast-beat and tremolo-picking of the Scandinavian incarnation", a style that later became known as "Cascadian black metal", in reference to the region where it emerged. In Australia, a scene arose led by bands like Deströyer 666, Vomitor, Hobbs' Angel of Death, Nocturnal Graves and Gospel of the Horns. This scene's typical style is a mixture of old-school black metal and raw thrash metal, influenced by old Celtic Frost, Bathory, Venom, and Sodom but also with its own elements. Melechesh was formed in Jerusalem in 1993, "the first overtly anti-Christian band to exist in one of the holiest cities in the world". Melechesh began as a straightforward black metal act, with their first foray into folk metal occurring on their 1996 EP "The Siege of Lachish". Their subsequent albums straddled black, death, and thrash metal. Another band, Arallu, was formed in the late 1990s and has ties to Melechesh and Salem. Melechesh and Arallu perform a style they call "Mesopotamian Black Metal", a blend of black metal and Mesopotamian folk music. Since the 2000s, a number of anti-Islamic and anti-religious black metal bands—whose members come from Muslim backgrounds—have emerged in the Middle East. Janaza, believed to be Iraq's first female black metal artist, released the demo "Burning Quran Ceremony" in 2010. Its frontwoman, Anahita, claimed her parents and brother were killed by a suicide bombing during the Iraq War. Another Iraqi band, Seeds of Iblis, also fronted by Anahita, released their debut EP "Jihad Against Islam" in 2011 through French label Legion of Death.
Metal news website Metalluminati suggests that their claims of being based in Iraq are a hoax. These bands, along with Tadnees (from Saudi Arabia), Halla (from Iran), False Allah (from Bahrain), and Mosque of Satan (from Lebanon), style themselves as the "Arabic Anti-Islamic Legion". Another Lebanese band, Ayat, drew much attention with their debut album "Six Years of Dormant Hatred", released through North American label Moribund Records in 2008. Some European bands have also begun expressing anti-Islamic views, most notably the Norwegian band Taake. Styles and subgenres. Regarding the sound of black metal, there are two conflicting groups within the genre: "those that stay true to the genre's roots, and those that introduce progressive elements". The former believe that the music should always be minimalist—performed only with the standard guitar-bass-drums setup and recorded in a low-fidelity style. One supporter of this view is Blake Judd of Nachtmystium, who has rejected labeling his band black metal because of its departure from the genre's typical sound. Snorre Ruch of Thorns, on the other hand, has said that modern black metal is "too narrow" and believes that this was "not the idea at the beginning". Since the 1990s, different styles of black metal have emerged, and some have melded Norwegian-style black metal with other genres: Ambient black metal. Ambient black metal, also known as atmospheric black metal, is a style of black metal that relies on the heavy incorporation of atmospheric, sometimes dreamy textures, and is therefore less aggressive. It often features synthesizers or classical instrumentation, typically for melody or an ethereal "shimmering" over the wall of sound provided by the guitars. The music is usually slow- to mid-paced with rare blast beat usage and without any abrupt changes, and it generally features slowly developing, sometimes repetitive melodies and riffs, which separate it from other black metal styles. Subject matter usually concerns nature, folklore, mythology, and personal introspection. Artists include Summoning, Agalloch and Wolves in the Throne Room. Black-doom. Black-doom, also known as blackened doom, is a style that combines the slowness and thicker, bassier sound of doom metal with the shrieking vocals and heavily distorted guitar sound of black metal. Black-doom bands maintain the Satanic ideology associated with black metal, while melding it with moodier themes more related to doom metal, like depression, nihilism and nature. They also use the slower pace of doom metal in order to emphasize the harsh atmosphere present in black metal. Examples of black-doom bands include Barathrum, Forgotten Tomb, Woods of Ypres, Deinonychus, Shining, Nortt, Bethlehem, early Katatonia, Tiamat, Dolorian, and October Tide. Depressive suicidal black metal. Pioneered by black-doom bands like Ophthalamia, Katatonia, Bethlehem, Forgotten Tomb and Shining, depressive suicidal black metal, also known as suicidal black metal, depressive black metal or DSBM, is a style that melds the second-wave style of black metal with doom metal, with lyrics revolving around themes such as depression, self-harm, misanthropy, suicide and death. DSBM bands draw on the lo-fi recording and highly distorted guitars of black metal, while employing the acoustic instruments and undistorted electric guitar timbres present in doom metal, interchanging slower, doom-like sections with faster tremolo picking.
Vocals are usually high-pitched, as in black metal, but lack energy, conveying feelings of hopelessness, desperation, and entreaty. One-man bands are more prominent in this genre than in others. Examples of bands include Xasthur, Leviathan, Strid, Silencer, Make a Change... Kill Yourself, Lifelover and I Shalt Become. Black 'n' roll. Black 'n' roll is a style of black metal that incorporates elements from 1970s hard rock and rock and roll music. Examples of black 'n' roll bands include Carpathian Forest, Kvelertak, Vreid, and Khold. Bands such as Satyricon, Nachtmystium, Nidingr, Craft, and Sarke have also experimented with the genre. Blackened crust. Crust punk groups such as Antisect, Sacrilege and Anti System took some influence from early black metal bands like Venom, Hellhammer, and Celtic Frost, while Amebix's lead vocalist and guitarist sent his band's early demo tape to Cronos of Venom, who replied by saying "We'll rip you off." Similarly, Bathory was initially inspired by crust punk as well as heavy metal. Crust punk was affected by the second wave of black metal in the 1990s, with some bands emphasizing these black metal elements. Iskra are probably the most obvious example of second-wave black metal-influenced crust punk; they coined the phrase "blackened crust" to describe their new style. The Japanese group Gallhammer also fused crust punk with black metal, while the English band Fukpig has been said to have elements of crust punk, black metal, and grindcore. North Carolina's Young and in the Way have been playing blackened crust since their formation in 2009. In addition, Norwegian band Darkthrone have incorporated crust punk traits in their more recent material. Many "red and anarchist black metal" (RABM) artists are influenced by crust punk or have a background in that scene. Blackened death-doom. Blackened death-doom is a genre that combines the slow tempos and monolithic drumming of doom metal, the complex and loud riffage of death metal, and the shrieking vocals of black metal. Examples of blackened death-doom bands include Morast, Faustcoven, The Ruins of Beverast, Bölzer, Necros Christos, Harvest Gulgaltha, Dragged into Sunlight, Hands of Thieves, and Soulburn. Kim Kelly, a journalist for Vice, has called Faustcoven "one of the finest bands to ever successfully meld black, death, and doom metal into a cohesive, legible whole." Blackened death metal. Blackened death metal is commonly death metal that incorporates musical, lyrical or ideological elements of black metal, such as an increased use of tremolo picking, anti-Christian or Satanic lyrical themes, and chord progressions similar to those used in black metal. Blackened death metal bands are also more likely to wear corpse paint and suits of armor than bands from other styles of death metal. Lower-range guitar tunings, death growls and abrupt tempo changes are common in the genre. Examples of blackened death metal bands are Belphegor, Behemoth, Akercocke, and Sacramentum. Melodic black-death. Melodic black-death (also known as blackened melodic death metal or melodic blackened death metal) is a genre of extreme metal that emerged when melodic death metal bands began drawing inspiration from black metal and European romanticism. Unlike most other black metal, however, this take on the genre incorporates an increased sense of melody and narrative.
Some bands who have played this style include Dissection, Sacramentum, Naglfar, God Dethroned, Dawn, Unanimated, Thulcandra, Skeletonwitch and Cardinal Sin. War metal. War metal (also known as war black metal or bestial black metal) is an aggressive, cacophonous, and chaotic subgenre of blackened death metal, described by "Rock Hard" journalist Wolf-Rüdiger Mühlmann as "rabid" and "hammering". Important influences include early black and death metal bands such as Sodom, Possessed, Autopsy, and Sarcófago, along with the first two Sepultura releases, as well as seminal grindcore acts like Repulsion. War metal bands include Blasphemy, Archgoat, Impiety, and Bestial Warlust. Blackened grindcore. Blackened grindcore is a fusion genre that combines elements of black metal and grindcore. Notable bands include Anaal Nathrakh and early Rotting Christ. Blackened thrash metal. Blackened thrash metal, also known as black-thrash, is a fusion genre that combines elements of black metal and thrash metal. Considered one of the first extreme metal fusions, it was inspired by bands such as Venom, Sodom, and Sarcófago. Notable bands include Aura Noir, Witchery, Black Fast, Sathanas, and Deströyer 666. Folk black metal, pagan metal, and Viking metal. Folk black metal, pagan metal and Viking metal are styles that incorporate elements of folk music, with pagan metal bands focusing on pagan lyrics and imagery, and Viking metal bands giving thematic focus to Norse mythology, Norse paganism, and the Viking Age, drawing more influence from Nordic folk music. While not focused on Satanism, these bands' use of ancient folklore and mythologies still expresses anti-Christian views, with folk black metal doing so as part of a "rebellion to the status quo". These styles developed concurrently with the rise of folk metal in Europe in the 1990s. Notable artists include Negură Bunget, Windir, Primordial, In the Woods..., Cruachan, and Bathory, to whose albums "Blood Fire Death" (1988) and "Hammerheart" (1990) the origin of Viking metal can be traced. Industrial black metal. Industrial black metal is a style of black metal that incorporates elements of industrial music. Mysticum, formed in 1991, was the first of these groups. DHG (Dødheimsgard) and Thorns from Norway, as well as Blut Aus Nord, N.K.V.D., and Blacklodge from France, have been acclaimed for their incorporation of industrial elements. Other industrial black metal musicians include Samael, The Axis of Perdition, Aborym, and ...And Oceans. In addition, The Kovenant, Mortiis and Ulver emerged from the Norwegian black metal scene, but later chose to experiment with industrial music. Post-black metal. Post-black metal is an umbrella term for genres that experiment beyond black metal's conventions and broaden their sounds, evolving past the genre's limits. Notable bands include Myrkur, Alcest, Bosse-de-Nage, and Wildernessking. Blackgaze. Blackgaze incorporates common black metal and post-black metal elements, such as blast beat drumming and high-pitched screamed vocals, with the melodic and heavily distorted guitar styles typically associated with shoegazing. It is associated with bands such as Deafheaven, Alcest, Vaura, Amesoeurs, Bosse-de-Nage, Oathbreaker, and Fen. Psychedelic black metal. Psychedelic black metal is a subgenre of black metal which employs psychedelic elements. Notable acts include Oranssi Pazuzu, Nachtmystium, Deafheaven, Woe, Amesoeurs, and In the Woods... Raw black metal.
Raw black metal is a subgenre that seeks to amplify the primitive qualities of the second wave of black metal by giving priority to its lo-fi production values. To achieve this, bands in this style usually emphasize higher pitches in their guitar sound and vocals, while employing techniques such as tremolo picking and blast beats more often. Its imagery is often associated with dystopian and minimalist tendencies. The style was pioneered by Ildjarn, with other notable bands including Gorgoroth and Darkthrone. Symphonic black metal. Symphonic black metal is a style of black metal that incorporates symphonic and orchestral elements. This may include the use of instruments found in symphony orchestras (piano, violin, cello, flute and keyboards), "clean" or operatic vocals, and guitars with less distortion. Notable bands include Emperor, Troll, Dimmu Borgir and Bal Sagoth. Ideology. Unlike other metal genres, black metal is associated with a worldview and ethos. It is fiercely opposed to Christianity and the other main institutional religions, Islam and Judaism. Many black metal bands are Satanists and see Satanism as a key part of black metal. Others advocate ethnic Paganism, "often coupled with nationalism", although the early Pagan bands did not call themselves 'black metal'. Black metal tends to be misanthropic and hostile to modern society. It is "a reaction against the mundanity, insincerity and emotional emptiness that participants feel is intrinsic to modern secular culture". Aaron Weaver from Wolves in the Throne Room said: "I think that black metal is an artistic movement that is critiquing modernity on a fundamental level, saying that the modern world view is missing something". As part of this, some parts of the scene glorify nature and have a fascination with the distant past. Black metal has been likened to Romanticism, and there is an undercurrent of romantic nationalism in the genre. Sam Dunn noted that "unlike any other heavy metal scene, the culture and the place is incorporated into the music and imagery". Individualism is also an important part of black metal, with Fenriz of Darkthrone describing black metal as "individualism above all". Unlike other kinds of metal, black metal has numerous one-man bands. However, it is argued that followers of Euronymous were anti-individualistic, and that "Black Metal is characterized by a conflict between radical individualism and group identity and by an attempt to accept both polarities simultaneously". The black metal scene tends to oppose political correctness, humanitarianism, consumerism, globalization and homogeneity. In his master's thesis, Benjamin Hedge Olson wrote that some artists can be seen as transcendentalists. Dissatisfied with a "world that they feel is devoid of spiritual and cultural significance", they try to leave or "transcend" their "mundane physical forms" and become one with the divine. This is done through their concerts, which he describes as "musical rituals" that involve self-mortification and taking on an alternative, "spiritual persona" (for example, by the wearing of costume and face paint). Generally, black metal strives to remain an underground phenomenon. Satanism. Black metal was originally a term for extreme metal bands with Satanic lyrics and imagery. However, most of the 'first wave' bands (including Venom, who coined the term 'black metal') were not Satanists; rather, they used Satanic themes to provoke controversy. 
One of the few exceptions was Mercyful Fate singer and Church of Satan member King Diamond, whom Michael Moynihan calls "one of the only performers of the '80s Satanic metal who was more than just a poseur using a devilish image for shock value". In the early 1990s, many Norwegian black-metallers presented themselves as genuine Devil worshippers. Mayhem's Euronymous was the key figure behind this. They attacked the Church of Satan for its "freedom and life-loving" views; the theistic Satanism they espoused was an inversion of Christianity. Benjamin Hedge Olson wrote that they transformed "Venom's quasi-Satanic stage theatrics into a form of cultural expression unique from other forms of metal or Satanism" and "abandoned the mundane identities and ambitions of other forms of metal in favor of religious and ideological fanaticism". Some prominent scene members—such as Euronymous and Faust—said that only bands who are Satanists can be called 'black metal'. Bands with a Norwegian style, but without Satanic lyrics, tended to use other names for their music. Some prominent artists still hold the view that black metal should be Satanic, while others believe that black metal does not need to be Satanic. An article in Metalion's "Slayer" fanzine attacked musicians that "care more about their guitars than the actual essence onto which the whole concept was and is based upon". Bands with a similar style but with Pagan lyrics tend to be referred to as 'Pagan Metal' by many 'purist' black-metallers. Others shun Satanism, seeing it as Christian or "Judeo-Christian" in origin, and regard Satanists as perpetuating the "Judeo-Christian" worldview. Quorthon of Bathory said he used 'Satan' to provoke and attack Christianity. However, with his third and fourth albums, "Under the Sign of the Black Mark" and "Blood Fire Death", he began "attacking Christianity from a different angle", realizing that Satanism is a "Christian product". Nevertheless, some artists use Satan as a symbol or metaphor for their beliefs, such as LaVeyan Satanists (who are atheist). Vocalist Gaahl, who follows Norse paganism, said: "We use the word 'Satanist' because it is Christian world and we have to speak their language ... When I use the word 'Satan', it means the natural order, the will of a man, the will to grow, the will to become the superman". Varg Vikernes called himself a Satanist in early interviews but "now downplays his former interest in Satanism", saying he was using Satan as a symbol for Odin as the 'adversary' of the Christian God. He saw Satanism as "an introduction to more indigenous heathen beliefs". Some bands such as Carach Angren and Enslaved do not have Satanic lyrics. Christianity. 'Unblack metal' (or 'Christian black metal') promotes Christianity through its lyrics and imagery. The first unblack metal record, "Hellig Usvart" (1994) by Australian artist Horde, was a provocative parody of Norwegian black metal. It sparked controversy, and death threats were issued against Horde. Norwegian Christian metal band Crush Evil adopted a black metal style in the late 1990s and were renamed Antestor. Many black-metallers see "Christian black metal" as an oxymoron and believe black metal cannot be Christian. In fact, the early unblack metal groups Horde and Antestor refused to call their music "black metal" because they did not share its ethos. Horde called its music "holy unblack metal" and Antestor called theirs "sorrow metal". 
Horde's Jayson Sherlock later said, "I will never understand why Christians think they can play Black Metal. I really don't think they understand what true Black Metal is". However, current unblack metal bands such as Crimson Moonlight feel that black metal has changed from an ideological movement to a purely musical genre, and thus call their music 'black metal'. Environmentalism. Black metal has a long tradition of environmentalism. Groups such as Botanist and Wolves in the Throne Room have been described as exemplifying radical environmentalism. Politics. A wide range of political views are found in the black metal scene. Black metal is generally not political music, and the vast majority of bands do not express political views. The genre is seen more as "fantastical escapism", and artists usually see themselves as merely depicting the "macabre nature of the world". Ihsahn of Emperor explained: "I see it much more as an atmospheric and emotional thing rather than a political one. Hardcore bands can deal with political things; black metal is something else". However, some black metal artists promote political ideologies. National Socialist black metal (NSBM) promotes neo-Nazi or far-right politics through its lyrics and imagery. Like Nazi punk, it is "distinguished only by ideology, not musical character". Artists typically meld neo-Nazism with ethnic European paganism; however, a few meld these beliefs with Satanism or occultism. Some commentators see it as a natural development of the "black metal worldview". Some members of the early Norwegian scene flirted with Nazi imagery, but this was largely for shock value and to provoke. Varg Vikernes—who now refers to his ideology as 'Odalism'—is credited with introducing such views into the scene. Some NSBM bands support fascist Satanist groups like the Order of Nine Angles. NSBM artists are a small minority within the genre. While many black metal fans boycott neo-Nazi artists, others are indifferent or appreciate the music without supporting the musicians. NSBM has been criticized by many prominent and influential black metal artists. Some liken Nazism to Christianity by arguing that both are authoritarian and collectivist and foster a "herd mentality". Olson writes that the shunning of Nazism by some black-metallers "has nothing to do with notions of a 'universal humanity' or a rejection of hate" but that Nazism is shunned "because its hatred is too specific and exclusive". Partly in reaction to NSBM, a small number of artists began promoting left-wing politics such as anarchism or Marxism, creating a movement known as "red and anarchist black metal" (RABM). Many of these artists have a background in anarchist crust punk. Artists labelled RABM include Iskra, Panopticon, Skagos, Storm of Sedition, Not A Cost, and Black Kronstadt. Some others with a similar outlook, such as Wolves in the Throne Room, are not overtly political and do not endorse the label.
4875
49698580
https://en.wikipedia.org/wiki?curid=4875
Bin Laden (disambiguation)
Osama bin Laden (1957–2011) was a Saudi-born terrorist and the co-founder of al-Qaeda. bin Laden or Bin Laden may also refer to:
4876
937293
https://en.wikipedia.org/wiki?curid=4876
Blizzard Entertainment
Blizzard Entertainment, Inc. is an American video game developer and publisher based in Irvine, California, and a subsidiary of Activision Blizzard. Founded in 1991, the company is best known for producing the highly influential massively multiplayer online role-playing game "World of Warcraft" (2004) as well as the multimillion-selling video game franchises "Diablo", "StarCraft", and "Overwatch". The company also operates Battle.net, an online gaming service. Founded as Silicon & Synapse, Inc. by three graduates of the University of California, Los Angeles (Michael Morhaime, Allen Adham, and Frank Pearce), the company began developing its own software in 1993 with games like "Rock n' Roll Racing" and "The Lost Vikings". It changed its name to Chaos Studios, Inc. the same year, then to Blizzard Entertainment after being acquired by distributor Davidson & Associates in 1994; that year, the company released "Warcraft: Orcs & Humans", which would receive numerous sequels and led to the highly popular "World of Warcraft". By the end of the decade, Blizzard had also found success with the action role-playing game "Diablo" (1997) and the strategy game "StarCraft" (1998). The company became part of Vivendi Games in 1998, which merged with Activision in 2008, culminating in the inclusion of the Blizzard brand name in the title of the resulting holding company; Activision Blizzard became completely independent from Vivendi in 2013. Microsoft acquired Activision Blizzard in 2023, maintaining that the company would continue to operate as a separate business as part of the larger Microsoft Gaming division; Blizzard Entertainment retains its function as the publisher of games developed by its studios. Since 2005, Blizzard Entertainment has hosted annual gaming conventions for fans to meet and to promote its games, called BlizzCon, as well as a number of global events outside the United States. In the 2010s and 2020s, Blizzard has continued development of expansion packs for "World of Warcraft" (the most recent being 2024's "The War Within"), while also releasing "StarCraft: Remastered" (2017), "Diablo III" (2012) and "Diablo IV" (2023), as well as new material, most notably the online multiplayer games "Hearthstone", a collectible card game; "Heroes of the Storm", a battle arena game; and "Overwatch" and "Overwatch 2", which are first-person shooters. Since 2018, the company's reputation has suffered from a series of poorly received games, controversies involving players and staff, and allegations of sexual harassment and other misconduct against leading Blizzard employees. History. Founding (1991–1994). Blizzard Entertainment was founded by Michael Morhaime, Allen Adham, and Frank Pearce as Silicon & Synapse in February 1991, after all three had earned their bachelor's degrees from the University of California, Los Angeles the year prior. The name "Silicon & Synapse" was a high concept from the three founders, with "silicon" representing the building block of a computer and "synapse" the building block of the brain. The initial logo was created by Stu Rose. To fund the company, each of them contributed about $10,000, with Morhaime borrowing the sum interest-free from his grandmother. Their offices were established in a business park near the University of California, Irvine in Irvine, California. During the first two years, the company focused on creating game ports for other studios. Interplay Productions' Brian Fargo was friends with Adham and had a 10% stake in Silicon & Synapse. 
Fargo provided the company with conversion contracts for the games Interplay was publishing, starting with "Battle Chess". Other ports included "J.R.R. Tolkien's The Lord of the Rings, Vol. I" and "Battle Chess II: Chinese Chess". Fargo then enlisted Silicon & Synapse around 1991 to help develop "RPM Racing", which Interplay was preparing for the launch of the Super Nintendo Entertainment System. Fargo remained impressed with Silicon & Synapse's work, and gave them the opportunity to write their own games to be published by Interplay. The first two titles developed solely by the company were "Rock n' Roll Racing", a sequel to "RPM Racing", and "The Lost Vikings", inspired by "Lemmings". Around 1993, co-founder Adham told the other executives that he no longer liked the name "Silicon & Synapse", as outsiders were confusing the element silicon used in microchips with the silicone polymer of breast implants. By the end of 1993, Adham changed the name to "Chaos Studios", reflecting the haphazardness of their development processes. Around this same time, the company started to explore options for publishing its own games, as its conversion contracts were not especially lucrative. Inspired by the multiplayer aspects of Westwood Studios' "Dune II" and the high fantasy setting of "The Lord of the Rings", the company began work on what would become "Warcraft: Orcs & Humans". Adham saw this as the start of a series of interconnected titles, similar to the "Gold Box" series by Strategic Simulations. To support its development and keep the company afloat, the studio took several more conversion contracts, though the founders were going into debt to keep their twelve developers employed. Davidson & Associates, a company that published educational software and had previously employed Silicon & Synapse for conversion contracts, made an offer to buy the company for $4 million. Interplay was negotiating to be the publisher for "Warcraft", and Fargo cautioned Adham and Morhaime against selling the company. Adham and Morhaime rejected Davidson & Associates' initial offer, but the company came back with another offer of $6.75 million, assuring the founders that they would have creative control over the games they developed. Adham and Morhaime accepted the offer in early 1994. Shortly after the sale, they were contacted by a Florida company, Chaos Technologies, which claimed trademark rights to the name "Chaos" and wanted the company to pay to keep the name. Not wanting to pay that sum, the executives decided to change the studio's name to "Ogre Studios" by April 1994. However, Davidson & Associates did not like this name, and forced the company to change it. According to Morhaime, Adham began running through a dictionary from the start, writing down any word that seemed interesting and passing it to the legal department to see if it had any complications. One of the first words they found interesting that also cleared the legal check was "blizzard", leading them to change their name to "Blizzard Entertainment" by May 1994. "Warcraft" was released in November 1994 and, within a year, helped to establish Blizzard among other development studios like Westwood. Acquisition by Vivendi and "World of Warcraft" (1995–2007). Blizzard Entertainment has changed hands several times since then. Davidson was acquired along with Sierra On-Line by a company called CUC International in 1996. 
CUC then merged with a hotel, real-estate, and car-rental franchiser called HFS Corporation to form Cendant in 1997. In 1998 it became apparent that CUC had engaged in accounting fraud for years before the merger. Cendant's stock lost 80% of its value over the next six months in the ensuing widely discussed accounting scandal. The company sold its consumer software operations, Sierra On-Line (which included Blizzard), to French publisher Havas in 1998, the same year Havas was purchased by Vivendi. Blizzard, at this point numbering about 200 employees, became part of the Vivendi Games group of Vivendi. In 1996, Blizzard Entertainment acquired Condor Games of San Mateo, California, which had been working on the action role-playing game (ARPG) "Diablo" for Blizzard at the time, and was led by David Brevik and brothers Max and Erick Schaefer. Condor was renamed Blizzard North, with Blizzard's existing Irvine studios colloquially referred to as Blizzard South. "Diablo" was released at the very start of 1997 alongside Battle.net, a matchmaking service for the game. Blizzard North developed the sequel "Diablo II" (2000) and its expansion pack "Lord of Destruction" (2001). Following the success of "Warcraft II", Blizzard began development of a science fiction-themed RTS, "StarCraft", and released the title in March 1998. The title was the top-selling PC game of the year, and led to further growth of the Battle.net service and the use of the game in esports. Around 2000, Blizzard engaged Nihilistic Software to work on a version of "StarCraft" for home consoles. Nihilistic was co-founded by Robert Huebner, who had worked on "StarCraft" and other games as a Blizzard employee before leaving to found the studio. The game, "StarCraft: Ghost", was stealth-oriented in contrast to the RTS gameplay of "StarCraft", and was a major feature of the 2002 Tokyo Game Show. However, over the next few years, the game entered development hell amid conflicts between Nihilistic and Blizzard over its direction. Blizzard ordered Nihilistic to stop work on "StarCraft: Ghost" in July 2004, and instead brought on Swingin' Ape Studios, a third-party studio that had just successfully released "Metal Arms: Glitch in the System" in 2003, to reboot the development of "Ghost". Blizzard fully acquired Swingin' Ape Studios in May 2005 to continue work on "Ghost". However, while the game was scheduled for release in 2005, it was targeted at sixth-generation consoles, such as the PlayStation 2 and original Xbox, just as the industry was transitioning to the seventh generation. Blizzard decided to cancel "Ghost" rather than extend its development period to target the newer consoles. Blizzard started work on a sequel to "Warcraft II" in early 1998, which was announced as a "role-playing strategy" game. "Warcraft III: Reign of Chaos", the third title set in the "Warcraft" fictional universe, was released in July 2002. "Warcraft III" has inspired many later games, influencing the real-time strategy and multiplayer online battle arena genres. Many of the characters, locations and concepts introduced in "Warcraft III" went on to play major roles in numerous later Blizzard titles. In 2002, Blizzard was able to reacquire the rights to three of its earlier Silicon & Synapse titles, "The Lost Vikings", "Rock n' Roll Racing" and "Blackthorne", from Interplay Entertainment and re-release them for the Game Boy Advance handheld console. Around 2003, Blizzard North was working on "Diablo III" as well as a planned science-fiction version dubbed "Starblo". 
Amid rumors that Vivendi was looking to sell its gaming division around 2003, Blizzard North's leadership, consisting of Brevik, the Schaefers, and Bill Roper, asked Blizzard to provide their studio protections from the potential sale, or else they would resign. After several rounds of tense communications, the four gave their resignations to Blizzard's management on June 30, 2003. As part of this, a significant portion of Blizzard North's staff were laid off, additional work on "Starblo" was terminated, and the remaining team focused on "Diablo III". Blizzard's management made the decision in August 2005 to consolidate Blizzard North into Blizzard Entertainment, relocating staff to the main Blizzard offices in Irvine. In 2004, Blizzard opened European offices in the Paris suburb of Vélizy, Yvelines, France. Blizzard began work on "World of Warcraft", a massively multiplayer online role-playing game (MMORPG) based on the "Warcraft" franchise with gameplay inspired by "EverQuest", near the end of 1999. The game was publicly announced in September 2001. Media excitement for "World of Warcraft" led Team 2 to grow significantly, from forty members to hundreds, and brought a large amount of crunch development to complete the game. In January 2004, Adham announced he was leaving the company, burnt out from his work on "World of Warcraft", transferring management to Morhaime. "World of Warcraft" was released on November 23, 2004, in North America, and on February 11, 2005, in Europe. By December 2004, the game was the fastest-selling PC game in the United States, and by March 2005, it had reached 1.5 million subscribers worldwide. Blizzard partnered with Chinese publisher The9 to publish and distribute "World of Warcraft" in China, as foreign companies could not directly publish in the country themselves. "World of Warcraft" launched in China in June 2005. By the end of 2007, "World of Warcraft" was considered a global phenomenon, having reached over 9 million subscribers and generated substantial revenue since its release. In April 2008, "World of Warcraft" was estimated to hold 62 percent of the MMORPG subscription market. With the success of "World of Warcraft", Blizzard Entertainment organized the first BlizzCon fan convention in October 2005, held at the Anaheim Convention Center. The inaugural event drew about 6,000 people and became an annual event which Blizzard uses to announce new games, expansions, and content for its properties. Blizzard's staff quadrupled from around 400 employees in 2004 to 1,600 by 2006 to provide more resources for "World of Warcraft" and its various expansions. To deal with its growing staff, Blizzard moved its headquarters from the UCI Research Park campus to a newly constructed 240,000-square-foot campus in Irvine that was formerly occupied by Broadcom and, before that, by AST Research; the former Research Park site was taken over by Linksys. Blizzard's new base was completed by March 2008; the city named the primary street on the campus 1 Blizzard Way to honor the company. The campus includes a twelve-foot-tall bronze statue of a "Warcraft" orc riding a wolf, surrounded by plaques representing the company's eight values at that point: "Gameplay First", "Commit to Quality", "Play Nice; Play Fair", "Embrace Your Inner Geek", "Learn & Grow", "Every Voice Matters", "Think Globally", and "Lead Responsibly". Vivendi merger with Activision and continued growth (2008–2017). 
Up through 2006, Bobby Kotick, the CEO of Activision, had been working to rebound the company from near-bankruptcy, and had established a number of new studios. However, Activision lacked a presence in the MMO market. Kotick saw that "World of Warcraft" was bringing in enormous subscription fees each year, and began approaching Vivendi's CEO Jean-Bernard Lévy about a potential acquisition of its struggling Vivendi Games division, which included Blizzard Entertainment. Lévy was open to a merger, but would only allow it if he controlled the majority of the combined company, knowing the value of "World of Warcraft" to Kotick. Among those Kotick spoke to for advice was Blizzard's Morhaime, who told Kotick that they had begun establishing lucrative in-roads into the Chinese market. Kotick accepted Lévy's deal, which was approved by shareholders in December 2007. By July 2008, the merger was complete, with Vivendi Games effectively dissolved except for Blizzard Entertainment, and the new company was named Activision Blizzard. Blizzard established a distribution agreement with the Chinese company NetEase in August 2008 to publish Blizzard's games in China. The deal covered Blizzard's games, with the exception of "World of Warcraft", which was still handled by The9, and focused on a title then gaining popularity as an esport within southeast Asia. The two companies established the Shanghai EaseNet Network Technology for managing the games within China. Blizzard and The9 prepared to launch the "World of Warcraft" expansion "Wrath of the Lich King", but the expansion came under scrutiny from China's content regulation board, the General Administration of Press and Publication, which rejected its publication within China in March 2009, even with preliminary modifications made by The9 to clear it. Rumors of Blizzard's dissatisfaction with The9 over this and other previous complications with "World of Warcraft" came to a head when, in April 2009, Blizzard announced it was terminating its contract with The9 and transferring operation of "World of Warcraft" in China to NetEase. Blizzard released an improved version of Battle.net (Battle.net 2.0) in March 2009, which included improved matchmaking, storefront features, and better support for all of Blizzard's existing titles, particularly "World of Warcraft". Having peaked at 12 million monthly subscriptions in 2010, "World of Warcraft" subscriptions sank to 6.8 million in 2014, the lowest number since the end of 2006, prior to the "Burning Crusade" expansion. However, "World of Warcraft" is still the world's most-subscribed MMORPG, and holds the Guinness World Record for the most popular MMORPG by subscribers. In 2008, Blizzard was honored at the 59th Annual Technology & Engineering Emmy Awards for the creation of "World of Warcraft". Mike Morhaime accepted the award. Following the merger, Blizzard found itself relying on its well-established properties at a time when the industry was experiencing a shift towards indie games. Blizzard established a few small teams within the company to develop new concepts based on the indie development approach. One of these teams quickly hit upon the idea of a collectible card game based on the "Warcraft" narrative universe, which ultimately became "Hearthstone", released as a free-to-play title in March 2014. "Hearthstone" reached over 25 million players by the end of 2014, and exceeded 100 million players by 2018. 
Another small internal team began work around 2008 on a new intellectual property known as "Titan", a more contemporary or near-future MMORPG that would have co-existed alongside "World of Warcraft". The project gained more visibility in 2010 as a result of information leaks. Blizzard continued to speak about the development of "Titan" over the next few years, with over 100 people within Blizzard working on the project. However, the development of "Titan" was troubled; internally, in May 2013, Blizzard cancelled the project (publicly reporting this in 2014) and reassigned most of the staff, leaving about 40 people, led by Jeff Kaplan, to either come up with a fresh idea within a few weeks or have their team reassigned to Blizzard's other departments. The small team hit upon the idea of a team-based multiplayer shooter game, reusing many of the assets from "Titan" but set in a new near-future narrative. The new project was greenlit by Blizzard and became known as "Overwatch", which was released in May 2016. "Overwatch" became the fourth main intellectual property of Blizzard, following "Warcraft", "StarCraft", and "Diablo". In addition to "Hearthstone" and "Overwatch", Blizzard Entertainment continued to produce sequels and expansions to its established properties during this period, including "StarCraft II: Wings of Liberty" (2010) and "Diablo III" (2012). Their major crossover title, "Heroes of the Storm", was released as a MOBA game in 2015. The game featured various characters from Blizzard's franchises as playable heroes, as well as different battlegrounds based on the "Warcraft", "Diablo", "StarCraft", and "Overwatch" universes. In the late 2010s, Blizzard released "StarCraft: Remastered" (2017) and "Warcraft III: Reforged" (2020), remastered versions of the original "StarCraft" and "Warcraft III", respectively. The May 2016 release of "Overwatch" was highly successful, and it was the highest-selling game on PC for 2016. Several traditional esport events were established within a year of the release of "Overwatch", such as the Overwatch World Cup, but Blizzard continued to expand this and announced its first professional esports league, the Overwatch League, at the 2016 BlizzCon event. The company purchased a studio at The Burbank Studios in Burbank, California, which it converted into a dedicated esports venue, Blizzard Arena, to be used for the Overwatch League and other events. The inaugural season of the Overwatch League launched on January 10, 2018, with 12 global teams playing. By the second season in 2019 the League had expanded to 20 teams, and with its third season in 2020, these teams were to travel across the globe in a transitional home/away-style format. In 2012, Blizzard Entertainment had 4,700 employees, with offices in 11 cities around the globe, including Austin, Texas. The company's headquarters in Irvine, California had 2,622 employees. Change of leadership (2018–2022). By 2018, a rift had developed between Kotick and Morhaime over how Blizzard should continue developing its games, with Morhaime wanting to give the developers the freedom to experiment while Kotick was focused on generating profit. Morhaime had considered resigning in 2017, but Kotick convinced him to stay on. Morhaime announced his plans to step down as company president and CEO on October 3, 2018, while remaining an advisor to the company. Morhaime stated publicly that he felt it was time for someone else to lead Blizzard, but those close to him said he had become tired of the conflicts with Kotick. 
Morhaime formally left on April 7, 2019, and was replaced by J. Allen Brack, the executive producer on "World of Warcraft". In February 2019, Kotick announced a company-wide layoff of 8% of Activision Blizzard staff, around 800 positions in total, due to lower revenues in 2018; this included a significant portion of Blizzard Entertainment, which had been seen as having a bloated head count over the years. Blizzard was planning the announcements of "Diablo IV" and "Overwatch 2" for the 2019 BlizzCon, and to keep the company focused, two other projects, codenamed "Ares" and "Orion", were cancelled. Frank Pearce announced he would be stepping down as Blizzard's Chief Development Officer on July 19, 2019, though he would remain in an advisory role similar to Morhaime's. Michael Chu, lead writer on many of Blizzard's franchises, including "Diablo", "Warcraft", and "Overwatch", announced in March 2020 that he was leaving the company after 20 years. On January 22, 2021, Activision transferred Vicarious Visions over to Blizzard Entertainment, stating that the Vicarious Visions team would be better positioned to provide long-term support for Blizzard. Vicarious had been working with Blizzard for about two years prior to this announcement on the planned remaster of "Diablo II", "Diablo II: Resurrected", and according to Brack, it made sense to incorporate Vicarious into Blizzard for ongoing support of the game and for other "Diablo" games, including "Diablo IV". Vicarious was completely merged into Blizzard by April 12, 2022, and renamed Blizzard Albany. In celebration of the company's 30th anniversary, Blizzard Entertainment released a compilation called "Blizzard Arcade Collection" in February 2021 for various video game platforms. The collection includes three of their classic video games, "The Lost Vikings", "Rock n' Roll Racing", and "Blackthorne", each of which contains additional upgrades and numerous modern features. Activision Blizzard became the subject of a lawsuit from the California Department of Fair Employment and Housing in July 2021, asserting that for several years the management of Blizzard, as well as Activision, had promoted a "frat boy" atmosphere that allowed and encouraged sexual misconduct towards female employees and discrimination in hiring practices. The lawsuit drew a large response from employees and groups outside Activision Blizzard. In the wake of these events, Brack, one of the few individuals directly named in the suit, announced he was leaving Blizzard to "pursue new opportunities"; he was replaced by co-leads Jen Oneal, the head of Vicarious Visions and the first woman in a leadership role for the company, and Mike Ybarra, a Blizzard executive vice president. Oneal announced in November 2021 that she would be leaving the company by the end of the year, leaving Ybarra as the sole leader of Blizzard. As a result of the California lawsuit and of delays and release issues with its more recent games, Activision Blizzard's stock came under severe pressure. Microsoft subsequently seized the opportunity to become one of the largest video game companies in the world, announcing in January 2022 its intent to acquire Activision Blizzard and its subsidiaries, including Blizzard, for $68.7 billion. The deal marked the largest acquisition in tech history, surpassing the $67 billion Dell-EMC merger of 2016. The deal closed on October 13, 2023, and Activision Blizzard moved into the Microsoft Gaming division. Blizzard acquired Proletariat, the developers of "Spellbreak", in June 2022 to help support "World of Warcraft". 
The 100-employee studio remained in Boston but would shutter "Spellbreak" as it moved onto "World of Warcraft". Challenges with NetEase and Microsoft acquisition (2022–present). Ahead of their license renewal in January 2023, Blizzard (via Activision Blizzard) and NetEase stated in November 2022 that they had been unable to come to an agreement on the renewal terms for their license, and that most Blizzard games would therefore cease operations in China in January 2023 until the situation could be resolved. According to a report by "The New York Times", several factors influenced Activision Blizzard's decision to terminate the agreement, including stronger demands from the Chinese government for knowledge of Activision Blizzard's internal business matters, NetEase's desire to license the games directly rather than run the license through a joint venture, and Activision Blizzard's concerns that NetEase was trying to start ventures of its own, including its 2018 investment in Bungie. NetEase was further concerned about the impact of the pending acquisition of Activision Blizzard by Microsoft. Activision Blizzard stated it was looking to other Chinese firms as replacements for NetEase so as to restore its games in China. By April 2024, Blizzard, with Microsoft's help, and NetEase had agreed to new publishing terms, with plans to bring Blizzard's games back to China by mid-2024 while maintaining all prior game ownership from the original publishing deal. Under the new deal, NetEase will also be able to bring games to the Xbox platform. Following completion of the acquisition, Microsoft announced on January 25, 2024, that it was laying off 1,900 staff from Microsoft Gaming. Alongside this, Blizzard President Mike Ybarra and Chief Design Officer Allen Adham announced they would be leaving the company, and the planned survival game from Blizzard was canceled. On January 29, 2024, Johanna Faries, the former general manager of the "Call of Duty" series, was named Blizzard Entertainment's new president, taking office on February 5. Following the unionization success of Raven Software's Game Workers Alliance (GWA) union for quality assurance (QA) testers, the 20-member QA team of Blizzard Albany announced a unionization drive in July 2022 as GWA Albany. The vote passed 14–0. On July 24, 2024, 500 artists, designers, engineers, producers, and quality assurance testers who work on "World of Warcraft" voted to unionize under the Communications Workers of America. The same day, 60 QA testers at Blizzard's Austin office, who work on various games including "Diablo IV" and "Hearthstone", also voted to unionize, forming the union "Texas Blizzard QA United-CWA". The following year, in May 2025, the members of Blizzard Team 4, who work on "Overwatch 2", also unionized with the Communications Workers of America, with nearly 200 game developers forming the wall-to-wall union "Overwatch Gamemakers Guild-CWA". Games. Blizzard Entertainment has developed 19 games since the company's inception in 1991. Main franchises. The majority of the games Blizzard has published are in the "Warcraft", "Diablo", and "StarCraft" series. Since the releases of "Warcraft: Orcs & Humans" (1994), "Diablo" (1997), and "StarCraft" (1998), the focus has been almost exclusively on those three franchises. "Overwatch" (2016) became an exception years later, bringing the number of main franchises to four. Each franchise is supported by other media based on its intellectual property, such as novels, collectible card games, comics and video shorts. 
Blizzard announced in 2006 that it would be producing a "Warcraft" live-action film. The movie was directed by Duncan Jones, financed and produced by Legendary Pictures, Atlas Entertainment, and others, and distributed by Universal Pictures. It was released in June 2016. On October 4, 2022, the "Overwatch" servers were officially shut off at the same time as those of "Overwatch 2" went up. Spin-offs. Blizzard has released two spin-offs of the main franchises: "Hearthstone" (2014), which is set in the existing "Warcraft" lore, and "Heroes of the Storm" (2015), which features playable characters from all four of Blizzard's franchises. Remasters. In 2015, Blizzard Entertainment formed a "Classic Games" division, a team focused on updating and remastering some of its older titles, with an initially announced focus on "StarCraft: Remastered" (2017), "Warcraft III: Reforged" (2020), and "Diablo II: Resurrected" (2021). Re-released games. In February 2021, Blizzard Entertainment released a compilation called "Blizzard Arcade Collection" for Microsoft Windows, Xbox One, PlayStation 4, and Nintendo Switch. The collection includes five of Blizzard's classic video games: "The Lost Vikings", "Rock n' Roll Racing", "Blackthorne", "The Lost Vikings 2" and "RPM Racing", with the last two games added in April 2021. Some of the modern features include 16:9 resolution, 4-player split-screen, rewinding and saving of game progress, watching replays, and adding graphic filters to change the look of the player's game. Additionally, it contains upgrades for each game, such as enhanced local multiplayer for "The Lost Vikings", new songs and artist performances for "Rock n' Roll Racing", as well as a new level map for "Blackthorne". A digital museum, included in the collection, features game art, unused content, and interviews. Unreleased and future games. Notable unreleased titles include "Warcraft Adventures: Lord of the Clans", an adventure game which was canceled on May 22, 1998; "Shattered Nations", a turn-based strategy game cancelled around 1996; and "StarCraft: Ghost", an action game aimed at consoles, co-developed with Nihilistic Software, which was "postponed indefinitely" on March 24, 2006, after spending much of its lifespan in development hell. Work on a project called "Nomad" started around 1998 after the release of "StarCraft", with development led by Duane Stinnett. "Nomad" was inspired by the tabletop role-playing game "Necromunda", which was played in a post-apocalyptic setting. The project had vague goals, and around that time, many Blizzard staff began playing the MMORPGs "EverQuest" and "Ultima Online". "Nomad" was cancelled in 1999 as Blizzard shifted to making its own MMORPG, "World of Warcraft". In the wake of the February 2019 layoffs, two projects were cancelled. One, codenamed "Orion", was an asynchronous card game for mobile devices designed by "Hearthstone" developers; while the game was considered fun to play when players were engaged in real time, the asynchronous aspect diluted the enjoyment of the game. The second, codenamed "Ares", was a first-person shooter set in the "StarCraft" universe, inspired by Electronic Arts' "Battlefield" series, which had been in development for three years. After seven years of development, Blizzard revealed the cancellation of an unannounced MMO codenamed "Titan" on September 23, 2014, though "Overwatch" was created from its assets. The company also has a history of declining to set release dates, choosing to instead take as much time as needed, generally saying a given product is "done when it's done." 
"Pax Imperia II" was originally announced as a title to be published by Blizzard. Blizzard eventually dropped "Pax Imperia II", though, when it decided it might be in conflict with their other space strategy project, which became known as "StarCraft". THQ eventually contracted with Heliotrope and released the game in 1997 as "". The company announced in January 2022 that it was near release of another new intellectual property, named "Odyssey" according to "Bloomberg News", a survival game that had been at work at the studio for nearly six years before its cancellation in 2024. "Bloomberg" stated that the game's origins came from "World of Warcraft" developer Craig Amai, and was originally prototyped using the Unreal Engine, which Blizzard licensed from Epic Games. When the game was revealed in 2022, about 100 employees were working on it, but around the same time, there was effort to switch from Unreal to Synapse, Blizzard's engine used for mobile games, though artists continued to develop assets in Unreal. Near when Microsoft completed its acquisition of Activision Blizzard, there was an internal belief that they would be able to bring on more developers to complete the transition to Synapse and have the game ready for a 2026 release, but with the culling of 1,900 staff from Microsoft Gaming in January 2024, the game's development was cancelled. Ports. The company, known at the time as the Silicon & Synapse, initially concentrated on porting other studios' games to computer platforms, developing 8 ports between 1992 and 1993. Company structure. As with most studios with multiple franchises, Blizzard Entertainment has organized different departments to oversee these franchises. Formally, since around the time of "World of Warcraft" in 2004, these have been denoted through simply numerical designations. The original three teams were: Since 2004, two new teams were created: Blizzard has used an informal motto, "it'll be ready when it's ready" to describe the release schedule for its games. The concept of accepting delays in software releases originally came about when the company opted to push back the release of "Diablo" to assure a quality product at release. By the time of "World of Warcraft", employees began using the phrase in response to eager fans looking for a release date. Technology. Battle.net 2.0. Blizzard Entertainment released its revamped Battle.net service in 2009. The platform provides online gaming, digital distribution, digital rights management, and social networking service. Battle.net allows people who have purchased Blizzard products to download digital copies of games they have purchased, without needing any physical media. On November 11, 2009, Blizzard required all "World of Warcraft" accounts to switch over to Battle.net accounts. This transition means that all current Blizzard titles can be accessed, downloaded, and played with a singular Battle.net login. Battle.net 2.0 is the platform for matchmaking service for Blizzard games, which offers players a host of additional features. Players are able to track their friend's achievements, view match history, avatars, etc. Players are able to unlock a wide range of achievements for Blizzard games. The service provides the user with community features such as friends lists and groups, and allows players to chat simultaneously with players from other Blizzard games using VoIP and instant messaging. For example, players no longer need to create multiple user names or accounts for most Blizzard products. 
To enable cross-game communication, players need to become either BattleTag or Real ID friends. Warden client. Blizzard Entertainment has made use of a special form of software known as the "Warden client". The Warden client is known to be used with Blizzard's online games such as "Diablo" and "World of Warcraft", and the Terms of Service contain a clause consenting to the Warden software's RAM scans while a Blizzard game is running. The Warden client scans a small portion of the code segment of running processes in order to determine whether any third-party programs are running; the goal is to detect and address players who may be attempting to run unsigned code or third-party programs in the game. This determination is made by hashing the scanned strings and comparing the hashed values to a list of hashes assumed to correspond to banned third-party programs. Warden's reliability in correctly discerning legitimate from illegitimate actions was called into question in a large-scale incident in which many Linux users were banned after an update to Warden caused it to incorrectly detect Cedega as a cheat program. Blizzard issued a statement claiming it had correctly identified and restored all affected accounts and credited them with 20 days' play. Warden scans all processes running on a computer, not just the game, and could conceivably come across private and other personally identifiable information. Because of these peripheral scans, Warden has been accused of being spyware and has drawn controversy among privacy advocates.
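The hash-comparison detection described above can be illustrated with a short sketch. This is a minimal illustration only, not Blizzard's actual implementation: the scan regions, hash algorithm, and digest list used by Warden are not public, so every value and function name below is a hypothetical placeholder.

```python
import hashlib

# Hypothetical blocklist: digests assumed to correspond to known cheat
# programs. Real Warden hash lists are not public; these are placeholders.
BANNED_HASHES = {
    "3f786850e387550fdab836ed7e6dc881de23001b",  # placeholder digest
}

def region_is_banned(memory_bytes: bytes) -> bool:
    """Hash a scanned snippet of a process's code segment and check the
    digest against the blocklist, mirroring the comparison step above."""
    return hashlib.sha1(memory_bytes).hexdigest() in BANNED_HASHES

def scan_processes(processes: dict[str, bytes]) -> list[str]:
    """Return names of processes whose scanned region matches a banned
    hash. `processes` maps a process name to the small slice of its
    code segment that the scanner reads."""
    return [name for name, region in processes.items() if region_is_banned(region)]

if __name__ == "__main__":
    # Two mock "processes" standing in for a memory scan of the system.
    mock = {"game.exe": b"\x90\x90\xc3", "overlay.exe": b"\xcc\xcc\xcc"}
    print(scan_processes(mock))
```

Note that a scheme like this can only flag byte patterns whose hashes are already on the list, which is consistent with the false-positive incident described above: if an update adds a digest that happens to match legitimate software such as Cedega, innocent users are flagged.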
Controversies and legal disputes. Blizzard Entertainment, Inc. v. Valve Corporation. Shortly after Valve filed its trademark for "DotA" to secure the franchising rights for "Dota 2", DotA-Allstars, LLC, run by former contributors to the game's predecessor, "Defense of the Ancients", filed an opposing trademark in August 2010. DotA All-Stars, LLC was sold to Blizzard Entertainment in 2011. After the opposition was overruled in Valve's favor, Blizzard filed an opposition against Valve in November 2011, citing its license agreement with developers as well as its ownership of DotA-Allstars, LLC. An agreement was reached between Blizzard and Valve in May 2012 giving Valve undisputed commercial rights to the "DotA" trademark, notably for "Dota 2", while retaining rights for Blizzard to use the name in a noncommercial manner for its community "with regard to player-created maps for Warcraft III and StarCraft II". As part of the agreement, Blizzard also renamed a custom map it had originally developed for "StarCraft II" from "Blizzard DOTA" to "Blizzard All-Stars", which would eventually become the stand-alone game "Heroes of the Storm". "California Department of Fair Employment and Housing v. Activision Blizzard". Following a two-year investigation, the California Department of Fair Employment and Housing (DFEH) filed a lawsuit against Activision Blizzard in July 2021 for gender-based discrimination and sexual harassment, principally within the Blizzard Entertainment workplace. The DFEH alleged that female employees were subjected to constant sexual harassment, unequal pay, and retaliation, as well as discrimination based on pregnancy. The suit also described a "pervasive frat boy workplace culture" at Blizzard that included objectification of women's bodies and jokes about rape. Activision Blizzard's statement described the suit as meritless, contending that action had been taken in any instances of misconduct. The company also objected to the DFEH not approaching it prior to filing. The lawsuit prompted an employee walkout and led J. Allen Brack and head of human resources Jesse Meschuk to step down. Amidst the lawsuit, Morhaime, Brack's predecessor, posted a statement to Twitter writing that he was "ashamed". Because of these allegations, Blizzard changed names that referenced employees in several of its franchises, including "Overwatch" and "World of Warcraft". Founder Electronics infringement lawsuit. On August 14, 2007, Beijing University Founder Electronics Co., Ltd. sued Blizzard Entertainment Limited for copyright infringement, claiming 100 million yuan in damages. The lawsuit alleged that the Chinese edition of "World of Warcraft" reproduced a number of Chinese typefaces made by Founder Electronics without permission. "FreeCraft". On June 20, 2003, Blizzard issued a cease and desist letter to the developers of an open-source clone of the "Warcraft" engine called "FreeCraft", claiming trademark infringement. This hobby project had the same gameplay and characters as "Warcraft II", but came with different graphics and music. In addition to the similar name, "FreeCraft" enabled players to use "Warcraft II" graphics, provided they had the "Warcraft II" CD. The programmers of the clone shut down their site without challenge. Soon after, the developers regrouped to continue the work under the name "Stratagus". "Hearthstone" ban and Hong Kong protests. During an October 2019 "Hearthstone Grandmasters" streaming event in Taiwan, player Ng Wai Chung, going by his online alias "Blitzchung", used an interview period to show support for the protestors in the 2019–20 Hong Kong protests. Shortly afterwards, on October 7, 2019, Blitzchung was disqualified from the tournament, forfeited his winnings to date, and was banned for a one-year period. The two shoutcasters conducting the interview were penalized with similar bans. Blizzard justified the ban by citing its "Grandmasters" tournament rules, which prevent players from anything that "brings [themselves] into public disrepute, offends a portion or group of the public, or otherwise damages [Blizzard's] image". Blizzard's response led to several protests from current "Hearthstone" players and other video game players, and to criticism from Blizzard's employees, who feared that Blizzard was giving in to censorship by the Chinese government. Protests were held, including at the 2019 BlizzCon in early November, to urge Blizzard to reverse the bans. The situation also drew the attention of several U.S. lawmakers, who feared that Blizzard, as a U.S. company, was letting China dictate how it handled speech, and who also urged that the bans be reversed. Blizzard CEO J. Allen Brack wrote an open letter on October 11, 2019, apologizing for the way Blizzard had handled the situation, and reduced the bans for both Blitzchung and the casters to six months. Brack reiterated that while Blizzard supports free speech and its decision was in no way tied to the Chinese government, it wants players and casters to avoid speaking beyond the tournament and the games in such interviews. King's "Diversity Tool" controversy. On May 12, 2022, Blizzard Entertainment released a blog post about the Diversity Space Tool, developed by a team at King, a mobile business unit at Activision Blizzard, alongside the MIT Game Lab. 
Jacqueline Chomatas, King's globalization project manager, described the tool as a "measurement device" to analyze how diverse characters are "when compared to the 'norm'". The post showed example images of the tool being used on the cast of "Overwatch", with graphs showing breakdowns of the character attributes, and stated that "The Overwatch 2 team at Blizzard has also had a chance to experiment with the tool, with equally enthusiastic first impressions." Blizzard shared its intent to release the tool during the summer and fall of 2022, with the goal of "making the tool available to the industry as a whole". The tool received heavy backlash online. Many people asked why Blizzard would create the tool instead of hiring diverse teams, and raised questions regarding the tool's rating scale. The blog post originally suggested that the tool was used in active development, mainly for "Overwatch", which led some Blizzard employees working on the game to publicly deny that the tool was used in "Overwatch" development and to criticize the tool further. On May 13, 2022, the blog post was edited to remove the example images of the tool and any mention of "Overwatch". Later, the post was deleted altogether. "MDY Industries, LLC v. Blizzard Entertainment, Inc.". On July 14, 2008, the United States District Court for the District of Arizona ruled on the case "MDY Industries, LLC v. Blizzard Entertainment, Inc.". The court found that MDY was liable for copyright infringement because users of its Glider bot program were breaking the End User License Agreement and Terms of Use for "World of Warcraft". MDY Industries appealed the judgment of the district court, and a judgment was delivered by the Ninth Circuit Court of Appeals on December 14, 2010, in which the summary judgment against MDY for contributory copyright infringement was reversed. Nevertheless, the appeals court ruled that the bot violated the DMCA, and the case was sent back to the district court for review in light of this decision. The decision in "MDY v. Blizzard" did affirm a prior Ninth Circuit ruling in "Vernor v. Autodesk, Inc." that software licenses, such as the one used by Blizzard for "WoW", were enforceable, enshrining the principle that video games could be sold to players as licenses rather than purchases. This ruling, though limited to the states of the Ninth Circuit, has been used by the industry to continue selling games as licenses to users. Privacy controversy and Real ID. On July 6, 2010, Blizzard Entertainment announced that it was changing the way its forums worked to require that users identify themselves with their real names. The reaction from the community was overwhelmingly negative, with multiple game magazines calling the change "foolhardy" and an "epic fail". It resulted in a significant user response on the Blizzard forums, including one thread on the issue that reached over 11,000 replies. Among the responses was the posting of personal details of a Blizzard employee who gave out his real name "to show it wasn't a big deal". Shortly after he revealed his real name, forum users posted personal information including his phone number, picture, age, home address, family members, and favorite TV shows and films. Some technology media outlets suggested that displaying real names through Real ID was a good idea and would benefit both Battle.net and the Blizzard community. 
Others, however, worried that Blizzard was opening its fans up to real-life dangers such as stalking, harassment, and employment issues, since a simple Internet search by someone's employer can reveal their online activities. Blizzard initially responded to some of the concerns by saying that the changes would not be retroactive to previous posts, that parents could set up the system so that minors could not post, and that posting to the forums was optional. However, due to the significant negative response, Blizzard President Michael Morhaime issued a statement rescinding the plan to use real names on Blizzard's forums for the time being. The idea behind the plan had been to allow players who had a relationship outside of the games to find each other more easily across all the Blizzard game titles. "StarCraft" privacy and other lawsuits. In 1998, Donald P. Driscoll, an Albany, California attorney, filed a suit on behalf of Intervention, Inc., a California consumer group, against Blizzard Entertainment for "unlawful business practices", namely collecting data from users' computers without their permission. On May 19, 2014, Blizzard Entertainment filed a lawsuit in federal court in California, alleging that unidentified programmers were involved in the creation of software that hacks "StarCraft II". Most of the alleged charges related to copyright infringement. In May 2010, MBCPlus Media, which operates the network MBCGame (a Korean television network that broadcast tournaments built around "StarCraft"), was sued by Blizzard for broadcasting "StarCraft" tournaments without the company's consent; Blizzard insisted that "StarCraft" is not a public domain offering, as it has invested significant money and resources to create the game. "World of Warcraft" private server complications. On December 5, 2008, Blizzard Entertainment issued cease and desist letters to many administrators of high-population "World of Warcraft" private servers (essentially slightly altered hosting servers of the actual "World of Warcraft" game that players do not have to pay for). Blizzard used the Digital Millennium Copyright Act to pressure many private servers into shutting down entirely. Related companies. Over the years, some former Blizzard Entertainment employees have moved on and established gaming companies of their own. Several of these departures followed the 2008 merger between Activision Holdings and Blizzard's then parent company, Vivendi Games, and, more recently, Activision Blizzard directing Blizzard away from properties like "Warcraft" and "StarCraft" that were not seen as financial boons to the larger company. These employees left to form smaller studios that would give them the creative freedom they felt they lacked at Blizzard. Collectively these studios are known as "Blizzard 2.0".
4878
7903804
https://en.wikipedia.org/wiki?curid=4878
Robert Bellarmine
Robert Bellarmine (4 October 1542 – 17 September 1621) was an Italian Jesuit and a cardinal of the Catholic Church. He was canonized a saint in 1930 and named Doctor of the Church, one of only 37. He was one of the most important figures in the Counter-Reformation. Bellarmine was a professor of theology and later rector of the Roman College, and in 1602 became Archbishop of Capua. He supported the reform decrees of the Council of Trent. He is also widely remembered for his role in the Giordano Bruno affair, the Galileo affair, and the trial of Friar Fulgenzio Manfredi. Early life. Robert Bellarmine was born in Montepulciano, the son of noble, albeit impoverished, parents, Vincenzo Bellarmino and his wife Cinzia Cervini, who was the sister of Pope Marcellus II. As a boy he knew Virgil by heart and composed a number of poems in Italian and Latin. One of his hymns, on Mary Magdalene, is included in the Roman Breviary. Bellarmine entered the Roman Jesuit novitiate in 1560, remaining in Rome for three years. He was then sent to a Jesuit house at Mondovì, in Piedmont, where he learned Greek. While at Mondovì, he came to the attention of Francesco Adorno, the local Jesuit provincial superior, who sent him to the University of Padua. Career. Bellarmine's systematic studies of theology began at Padua in 1567 and 1568, where his teachers were adherents of Thomism. In 1569, he was sent to finish his studies at the University of Leuven in Brabant. There he was ordained and obtained a reputation both as a professor and as a preacher. He was the first Jesuit to teach at the university, where the subject of his course was the "Summa Theologica" of Thomas Aquinas. He was involved in a controversy with Michael Baius on the subject of grace and free will, and wrote a Hebrew grammar. His residency in Leuven lasted seven years. In poor health, in 1576 he made a journey to Italy. Here he remained, commissioned by Pope Gregory XIII to lecture on polemical theology in the new Roman College, now known as the Pontifical Gregorian University. Later, he would promote the cause of the beatification of Aloysius Gonzaga, who had been a student at the college during Bellarmine's tenure. His lectures were published under the title "De controversiis" in four large volumes. New duties after 1589. Until 1589, Bellarmine was occupied as professor of theology. After the murder in that year of Henry III of France, Pope Sixtus V sent Enrico Caetani as legate to Paris to negotiate with the Catholic League of France, and chose Bellarmine to accompany him as theologian. He was in the city during its siege by Henry of Navarre. Upon the death of Pope Sixtus V in 1590, the Count of Olivares wrote to Philip II of Spain, "Bellarmine ... would not do for a Pope, for he is mindful only of the interests of the Church and is unresponsive to the reasons of princes." Pope Clement VIII said of him, "the Church of God had not his equal in learning". Bellarmine was made rector of the Roman College in 1592, examiner of bishops in 1598, and cardinal in 1599. Immediately after his appointment as Cardinal, Pope Clement made him a Cardinal Inquisitor, in which capacity he served as one of the judges at the trial of Giordano Bruno, and concurred in the decision which condemned Bruno to be burned at the stake as a heretic. In 1602 he was made archbishop of Capua. He had written against bishops failing to reside in their dioceses. As bishop he put into effect the reforming decrees of the Council of Trent. 
He received some votes in the two conclaves of 1605, which elected Pope Leo XI and Pope Paul V, and again in 1621, when Pope Gregory XV was elected, but his being a Jesuit counted against him in the judgement of many of the cardinals. Thomas Hobbes saw Bellarmine in Rome at a service on All Saints Day (1 November) 1614 and, exempting him alone from a general castigation of cardinals, described him as "a little lean old man" who lived "more retired". The Galileo case. In 1616, on the orders of Paul V, Bellarmine summoned Galileo, notified him of a forthcoming decree of the Congregation of the Index condemning the Copernican doctrine of the mobility of the Earth and the immobility of the Sun, and ordered him to abandon it. Galileo agreed to do so. When Galileo later complained of rumours to the effect that he had been forced to abjure and do penance, Bellarmine wrote out a certificate denying the rumours, stating that Galileo had merely been notified of the decree and informed that, as a consequence of it, the Copernican doctrine could not be "defended or held". Unlike the formal injunction reportedly served on Galileo in 1616, this certificate would have allowed Galileo to continue using and teaching the mathematical content of Copernicus's theory as a purely theoretical device for predicting the apparent motions of the planets. According to some of his letters, Cardinal Bellarmine believed that a demonstration for heliocentrism could not be found because it would contradict the unanimous consent of the Fathers' scriptural exegesis, adherence to which the Council of Trent, in 1546, had defined as binding on all Catholics. In other passages, Bellarmine made clear that he did not support the heliocentric model because of the lack of evidence at the time ("I will not believe that there is such a demonstration, until it is shown to me"), as he wrote in a 1615 letter to the heliocentrist Paolo Antonio Foscarini. In 1633, nearly twelve years after Bellarmine's death, Galileo was again called before the Inquisition in this matter. Galileo produced Bellarmine's certificate for his defense at the trial. According to Pierre Duhem and Karl Popper, "in one respect, at least, Bellarmine had shown himself a better scientist than Galileo", by disallowing the possibility of a "strict proof" of the earth's motion, on the grounds that an astronomical theory merely "saves the appearances" without necessarily revealing what "really happens". Philosopher of science Thomas Kuhn also discussed the affair in his book "The Copernican Revolution", commenting on Cesare Cremonini, who refused to look through Galileo's telescope. Death. Robert Bellarmine retired to Sant'Andrea degli Scozzesi, the Jesuit college of Saint Andrew in Rome. He died on 17 September 1621, aged 78. He was buried in the Church of St. Ignatius in Rome. Works. Bellarmine's books bear the stamp of their period; the effort for literary elegance (so-called "maraviglia") had given place to a desire to pile up as much material as possible, to embrace the whole field of human knowledge, and incorporate it into theology. His controversial works provoked many replies, and were studied for some decades after his death. At Leuven he made extensive studies in the Church Fathers and scholastic theologians, which gave him the material for his book "De scriptoribus ecclesiasticis" (Rome, 1613). It was later revised and enlarged by Sirmond, Labbeus, and Casimir Oudin. Bellarmine wrote the preface to the new Sixto-Clementine Vulgate. Bellarmine also prepared for posterity his own commentary on each of the Psalms. 
An English translation from the Latin was published in 1866. Dogmatics. From his research grew "Disputationes de controversiis christianae fidei" (also called "Controversiae"), first published at Ingolstadt in 1581–1593. This major work was the earliest attempt to systematize the various religious disputes between Catholics and Protestants. Bellarmine reviewed the issues and devoted eleven years to it while at the Roman College. In August 1590, Pope Sixtus V decided to place the first volume of the "Disputationes" on the Index because Bellarmine argued in it that the Pope is not the temporal ruler of the whole world and that temporal rulers do not derive their authority to rule from God but from the consent of the governed. However, Sixtus died before the revised Index was published, and the next Pope, Urban VII, removed the book from the Index during his brief twelve-day reign. In 1597–98, he published a catechism in two versions (one short, one longer), which was approved by Pope Clement VIII and used as a standard catechetical text for centuries. Venetian Interdict. Under Pope Paul V (reigned 1605–1621), a major conflict arose between Venice and the Papacy. Paolo Sarpi, as spokesman for the Republic of Venice, protested against the papal interdict, and reasserted the principles of the Council of Constance and of the Council of Basel, denying the pope's authority in secular matters. Bellarmine wrote three rejoinders to the Venetian theologians, and may have warned Sarpi of an impending murderous attack, when in September 1607 an unfrocked friar and brigand by the name of Rotilio Orlandini planned to kill Sarpi for the sum of 8,000 crowns. Orlandini's plot was discovered, and when he and his accomplices crossed from Papal into Venetian territory they were arrested. Allegiance oath controversy and papal authority. Bellarmine also became involved in controversy with King James I of England. Beginning as a point of principle for English Catholics, this debate drew in figures from much of Western Europe. It raised the profile of both protagonists: King James as a champion of his own restricted Calvinist Protestantism, and Bellarmine as a champion of Tridentine Catholicism. Devotional works. During his retirement, he wrote several short books intended to help ordinary people in their spiritual life: "De ascensione mentis in Deum per scalas rerum creatorum opusculum" ("The Mind's Ascent to God by the Ladder of Created Things"; 1614), which was translated into English as "Jacob's Ladder" (1638) without acknowledgement; "The Art of Dying Well" (1619; in Latin, with an English translation under this title by Edward Coffin); and "The Seven Words on the Cross". Canonization and final resting place. Robert Bellarmine was canonized by Pope Pius XI in 1930; the following year he was declared a Doctor of the Church. His remains, in a cardinal's red robes, are displayed behind glass under a side altar in the Church of Saint Ignatius, the chapel of the Roman College, next to the body of his student Aloysius Gonzaga, as he himself had wished. In the General Roman Calendar Saint Robert Bellarmine's feast day is on 17 September, the day of his death; but some continue to use pre-1969 calendars, in which for 37 years his feast day was on 13 May. The rank assigned to his feast has been "double" (1932–1959), "third-class feast" (1960–1968), and, since the 1969 revision, "memorial".
4880
97840
https://en.wikipedia.org/wiki?curid=4880
Bildungsroman
In literary criticism, a bildungsroman is a literary genre that focuses on the psychological and moral growth and change of the protagonist from childhood to adulthood (coming of age). The term comes from the German words "Bildung" ('formation' or 'education') and "Roman" ('novel'). Origin. The term was coined in 1819 by philologist Johann Karl Simon Morgenstern in his university lectures, and was later famously reprised by Wilhelm Dilthey, who legitimized it in 1870 and popularized it in 1905. The genre is further characterized by a number of formal, topical, and thematic features. The term "coming-of-age novel" is sometimes used interchangeably with bildungsroman, but its use is usually wider and less technical. The birth of the bildungsroman is normally dated to the publication of "Wilhelm Meister's Apprenticeship" by Johann Wolfgang von Goethe in 1795–96, or, sometimes, to Christoph Martin Wieland's "Geschichte des Agathon" of 1767. Although the bildungsroman arose in Germany, it has had extensive influence first in Europe and later throughout the world. Thomas Carlyle's English translation of Goethe's novel (1824) and his own "Sartor Resartus" (1833–34), the first English bildungsroman, inspired many British novelists. In the 20th century, it spread to France and several other countries around the globe. Barbara Whitman noted that the "Iliad" might be the first bildungsroman. It is not just "the story of the Trojan War. The Trojan War is in effect the backdrop for the story of Achilles' development. At the beginning Achilles is still a rash youth, making rash decisions which cost dearly to himself and all around him. (...) The story reaches its conclusion when Achilles has reached maturity and allows King Priam to recover Hector's body". The genre translates fairly directly into the cinematic form, the coming-of-age film. Plot outline. A bildungsroman relates the growing up or "coming of age" of a generally naive person who goes in search of answers to life's questions, with the expectation that these will result in gaining experience of the world. The genre evolved from folklore tales of a dunce or youngest child going out in the world to seek their fortune. Usually, at the beginning of the story there is an emotional loss which makes the protagonist leave on their journey. In a bildungsroman, the goal is maturity, and the protagonist achieves it gradually and with difficulty. The genre often features a main conflict between the main character and society. Typically, the values of society are gradually accepted by the protagonist, and they are ultimately accepted into society—the protagonist's mistakes and disappointments are over. In some works, the protagonist is able to reach out and help others after having achieved maturity. Franco Moretti "argues that the main conflict in the bildungsroman is the myth of modernity with its overvaluation of youth and progress as it clashes with the static teleological vision of happiness and reconciliation found in the endings of Goethe's "Wilhelm Meister" and even Jane Austen's "Pride and Prejudice"". There are many variations and subgenres of bildungsroman that focus on the growth of an individual. An "Entwicklungsroman" ('development novel') is a story of general growth rather than self-cultivation. An "Erziehungsroman" ("education novel") focuses on training and formal schooling, while a "Künstlerroman" ("artist novel") is about the development of an artist and shows a growth of the self. 
Furthermore, some memoirs and published journals can be regarded as bildungsromans despite being presented as predominantly factual (e.g. "The Dharma Bums" by Jack Kerouac or "The Motorcycle Diaries" by Ernesto "Che" Guevara). The term is also more loosely used to describe coming-of-age films and related works in other genres.
4881
1359226
https://en.wikipedia.org/wiki?curid=4881
Bachelor
A bachelor is a man who is not and never has been married. Etymology. A bachelor is first attested as the 12th-century "bacheler": a knight bachelor, a knight too young or poor to gather vassals under his own banner. The Old French "bacheler" presumably derives from cognate Provençal and Italian forms, but the ultimate source of the word is uncertain. The proposed Medieval Latin *"baccalaris" ("vassal", "field hand") is only attested late enough that it may have derived from the vernacular languages, rather than from the Latin of southern France and northern Spain. Alternatively, it has been derived from Latin "baculum" ("a stick"), in reference to the wooden sticks used by knights in training. History. From the 14th century, the term "bachelor" was also used for a junior member of a guild (otherwise known as "yeomen") or university and then for low-level ecclesiastics, such as young monks and recently appointed canons. As an inferior grade of scholarship, it came to refer to one holding a "bachelor's degree". This sense of the Latin "baccalarius" is first attested at the University of Paris in the 13th century, in the system of degrees established under the auspices of Pope Gregory IX, as applied to scholars still under instruction. There were two classes of such bachelors: theological candidates passed for admission to the divinity course, and those who had completed the course and were entitled to proceed to the higher degrees. In the Victorian era, the term "eligible bachelor" was used in the context of upper-class matchmaking, denoting a young man who was not only unmarried and eligible for marriage, but also considered "eligible" in financial and social terms for the prospective bride under discussion. Also in the Victorian era, the term "confirmed bachelor" denoted a man who desired to remain single. By the later 19th century, the term "bachelor" had acquired the general sense of "unmarried man". The expression "bachelor party" is first recorded in 1882. In 1895, a feminine equivalent, "bachelor-girl", was coined, replaced in US English by "bachelorette" by the mid-1930s. This terminology is now generally seen as antiquated, and has been largely replaced by the gender-neutral term "single" (first recorded 1964). In England and Wales, the term "bachelor" remained the official term used for the purpose of marriage registration until 2005, when it was abolished in favor of "single". Bachelors have been subject to penal laws in many countries, most notably in Ancient Sparta and Rome. At Sparta, men unmarried after a certain age were subject to various penalties ("atimía"): they were forbidden to watch women's gymnastics; during the winter, they were made to march naked through the agora singing a song about their dishonor; and they were not accorded the traditional respect due to the elderly. Some Athenian laws were similar. Over time, some punishments developed into no more than a teasing game. In some parts of Germany, for instance, men who were still unmarried by their 30th birthday were made to sweep the stairs of the town hall until kissed by a "virgin". In a 1912 Pittsburgh Press article, there was a suggestion that local bachelors should wear a special pin that identified them as such, or a black necktie to symbolize that "...they [bachelors] should be in perpetual mourning because they are so foolish as to stay unmarried and deprive themselves of the comforts of a wife and home." The idea of a tax on bachelors has existed throughout the centuries. 
Bachelors in Rome fell under the Lex Julia of 18 BC and the Lex Papia Poppaea of AD 9: these laid heavy fines on unmarried or childless people while providing certain privileges to those with several children. The Marriage Duty Act 1695, imposed by the English Crown, taxed single males over 25 years old to help generate income for the Nine Years' War. In Britain, taxes occasionally fell more heavily on bachelors than on other persons: examples include 6 & 7 Will. 3, the 1785 Tax on Servants, and the 1798 Income Tax. A study conducted by professor Charles Waehler at the University of Akron in Ohio on never-married heterosexual males found that once such men reach middle age, they become less likely to marry and tend to remain unattached later in life. The study concluded that there is only a 1-in-6 chance that men older than 40 will leave the single life, and that after age 45, the odds fall to 1-in-20. In certain Gulf Arab countries, "bachelor" can refer to men who are single as well as to immigrant men married to a spouse residing in their country of origin (due to the high added cost of sponsoring a spouse onsite). Bachelorette. The term "bachelorette" is sometimes used to refer to a woman who has never been married. The traditional female equivalent to bachelor is spinster, which is considered pejorative and implies unattractiveness (i.e. old maid, cat lady). The term "bachelorette" has been used in its place, particularly in the context of bachelorette parties and the reality TV series "The Bachelorette".
4882
19600678
https://en.wikipedia.org/wiki?curid=4882
Background radiation
Background radiation is a measure of the level of ionizing radiation present in the environment at a particular location which is not due to the deliberate introduction of radiation sources. Background radiation originates from a variety of sources, both natural and artificial. These include both cosmic radiation and environmental radioactivity from naturally occurring radioactive materials (such as radon and radium), as well as man-made medical X-rays, fallout from nuclear weapons testing and nuclear accidents. Definition. Background radiation is defined by the International Atomic Energy Agency as "dose or the dose rate (or an observed measure related to the dose or dose rate) attributable to all sources other than the one(s) specified." A distinction is thus made between the dose which is already in a location, which is defined here as being "background", and the dose due to a deliberately introduced and specified source. This is important where radiation measurements are taken of a specified radiation source, where the existing background may affect this measurement. An example would be measurement of radioactive contamination in a gamma radiation background, which could increase the total reading above that expected from the contamination alone. However, if no radiation source is specified as being of concern, then the total radiation dose measurement at a location is generally called the background radiation, and this is usually the case where an ambient dose rate is measured for environmental purposes. Background radiation varies with location and time. Natural background radiation. Radioactive material is found throughout nature. Detectable amounts occur naturally in soil, rocks, water, air, and vegetation, from which it is inhaled and ingested into the body. In addition to this "internal exposure", humans also receive "external exposure" from radioactive materials that remain outside the body and from cosmic radiation from space. The worldwide average natural dose to humans is about 2.4 mSv per year. This is four times the worldwide average artificial radiation exposure, which in 2008 amounted to about 0.6 mSv per year. In some developed countries, like the US and Japan, artificial exposure is, on average, greater than the natural exposure, due to greater access to medical imaging. In Europe, average natural background exposure by country ranges from under 2 mSv annually in the United Kingdom to more than 7 mSv annually for some groups of people in Finland. The International Atomic Energy Agency states: "Exposure to radiation from natural sources is an inescapable feature of everyday life in both working and public environments. This exposure is in most cases of little or no concern to society, but in certain situations the introduction of health protection measures needs to be considered, for example when working with uranium and thorium ores and other Naturally Occurring Radioactive Material (NORM). These situations have become the focus of greater attention by the Agency in recent years." Terrestrial sources. Terrestrial background radiation, as discussed here, includes only sources that remain external to the body. The major radionuclides of concern are potassium, uranium and thorium and their decay products, some of which, like radium and radon, are intensely radioactive but occur in low concentrations. 
Most of these sources have been decreasing through radioactive decay since the formation of the Earth, as there is no significant amount currently being transported to the Earth. Thus, the present activity on Earth from uranium-238 is only half of what it originally was, because of its 4.5-billion-year half-life, and potassium-40 (half-life 1.25 billion years) is only at about 8% of its original activity. Over the span of human existence, however, the amount of radiation has decreased very little. Many isotopes with shorter half-lives (and thus more intense radioactivity) have not decayed out of the terrestrial environment, because of their ongoing natural production. Examples of these are radium-226 (a decay product of thorium-230 in the decay chain of uranium-238) and radon-222 (a decay product of radium-226 in said chain). Thorium and uranium (and their daughters) primarily undergo alpha and beta decay, and are not easily detectable. However, many of their daughter products are strong gamma emitters. Thorium-232 is detectable via a 239 keV peak from lead-212; 511, 583 and 2614 keV peaks from thallium-208; and 911 and 969 keV peaks from actinium-228. Uranium-238 manifests as the 609, 1120, and 1764 keV peaks of bismuth-214 ("cf." the same peaks for atmospheric radon). Potassium-40 is detectable directly via its 1461 keV gamma peak. The level over the sea and other large bodies of water tends to be about a tenth of the terrestrial background. Conversely, coastal areas (and areas by the side of fresh water) may have an additional contribution from dispersed sediment. Airborne sources. The biggest source of natural background radiation is airborne radon, a radioactive gas that emanates from the ground. Radon and its isotopes, parent radionuclides, and decay products all contribute to an average inhaled dose of 1.26 mSv/a (millisieverts per year). Radon is unevenly distributed and varies with the weather, such that much higher doses apply in many areas of the world, where it represents a significant health hazard. Concentrations over 500 times the world average have been found inside buildings in Scandinavia, the United States, Iran, and the Czech Republic. Radon is a decay product of uranium, which is relatively common in the Earth's crust, but more concentrated in ore-bearing rocks scattered around the world. Radon seeps out of these ores into the atmosphere or into ground water or infiltrates into buildings. It can be inhaled into the lungs, along with its decay products, where they will reside for a period of time after exposure. Although radon is naturally occurring, exposure can be enhanced or diminished by human activity, notably house construction. A poorly sealed dwelling floor, or poor basement ventilation, in an otherwise well-insulated house can result in the accumulation of radon within the dwelling, exposing its residents to high concentrations. The widespread construction of well-insulated and sealed homes in the northern industrialized world has led to radon becoming the primary source of background radiation in some localities in northern North America and Europe. Basement sealing and suction ventilation reduce exposure. Some building materials, for example lightweight concrete with alum shale, phosphogypsum and Italian tuff, may emanate radon if they contain radium and are porous to gas. Radiation exposure from radon is indirect. Radon has a short half-life (4 days) and decays into other solid particulate radium-series radioactive nuclides. 
These radioactive particles are inhaled and remain lodged in the lungs, causing continued exposure. Radon is thus considered the second leading cause of lung cancer after smoking, and accounts for 15,000 to 22,000 cancer deaths per year in the US alone. However, experimental results pointing in the opposite direction are still under discussion. About 100,000 Bq/m3 of radon was found in Stanley Watras's basement in 1984. He and his neighbours in Boyertown, Pennsylvania, United States, may hold the record for the most radioactive dwellings in the world. International radiation protection organizations estimate that a committed dose may be calculated by multiplying the equilibrium equivalent concentration (EEC) of radon by a factor of 8 to 9 and the EEC of thoron by a factor of 40. Most of the atmospheric background is caused by radon and its decay products. The gamma spectrum shows prominent peaks at 609, 1120, and 1764 keV, belonging to bismuth-214, a radon decay product. The atmospheric background varies greatly with wind direction and meteorological conditions. Radon can also be released from the ground in bursts and then form "radon clouds" capable of traveling tens of kilometers. Cosmic radiation. The Earth and all living things on it are constantly bombarded by radiation from outer space. This radiation primarily consists of positively charged ions, from protons to iron and larger nuclei, derived from outside the Solar System. It interacts with atoms in the atmosphere to create an air shower of secondary radiation, including X-rays, muons, protons, alpha particles, pions, electrons, and neutrons. The immediate dose from cosmic radiation is largely from muons, neutrons, and electrons, and this dose varies in different parts of the world based largely on the geomagnetic field and altitude. For example, the city of Denver in the United States (at 1650 meters elevation) receives a cosmic ray dose roughly twice that of a location at sea level. This radiation is much more intense in the upper troposphere, around 10 km altitude, and is thus of particular concern for airline crews and frequent passengers, who spend many hours per year in this environment. During their flights, airline crews typically receive an additional occupational dose of up to 2.19 mSv per year, according to various studies. Similarly, cosmic rays cause higher background exposure in astronauts than in humans on the surface of Earth. Astronauts in low orbits, such as in the International Space Station or the Space Shuttle, are partially shielded by the magnetic field of the Earth, but also pass through the Van Allen radiation belts, where charged particles are trapped by the Earth's magnetic field. Outside low Earth orbit, as experienced by the Apollo astronauts who traveled to the Moon, this background radiation is much more intense, and represents a considerable obstacle to potential future long-term human exploration of the Moon or Mars. Cosmic rays also cause elemental transmutation in the atmosphere, in which secondary radiation generated by the cosmic rays combines with atomic nuclei in the atmosphere to generate different nuclides. Many so-called cosmogenic nuclides can be produced, but probably the most notable is carbon-14, which is produced by interactions with nitrogen atoms. These cosmogenic nuclides eventually reach the Earth's surface and can be incorporated into living organisms. 
The production of these nuclides varies slightly with short-term variations in solar cosmic ray flux, but is considered practically constant over long scales of thousands to millions of years. The constant production, incorporation into organisms and relatively short half-life of carbon-14 are the principles used in radiocarbon dating of ancient biological materials, such as wooden artifacts or human remains. The cosmic radiation at sea level usually manifests as 511 keV gamma rays from the annihilation of positrons created by nuclear reactions of high-energy particles and gamma rays. At higher altitudes there is also the contribution of a continuous bremsstrahlung spectrum. Food and water. Two of the essential elements that make up the human body, namely potassium and carbon, have radioactive isotopes that add significantly to our background radiation dose. An average human contains about 17 milligrams of potassium-40 (40K) and about 24 nanograms (10⁻⁹ g) of carbon-14 (14C) (half-life 5,730 years). Excluding internal contamination by external radioactive material, these two are the largest components of internal radiation exposure from biologically functional components of the human body. About 4,000 nuclei of 40K decay per second, and a similar number of 14C. The energy of the beta particles produced by 40K is about 10 times that of the beta particles from 14C decay. 14C is present in the human body at a level of about 3700 Bq (0.1 μCi) with a biological half-life of 40 days. This means there are about 3700 beta particles per second produced by the decay of 14C. However, a 14C atom is in the genetic information of about half the cells, while potassium is not a component of DNA. The decay of a 14C atom inside DNA in one person happens about 50 times per second, changing a carbon atom to one of nitrogen. The global average internal dose from radionuclides other than radon and its decay products is 0.29 mSv/a, of which 0.17 mSv/a comes from 40K, 0.12 mSv/a comes from the uranium and thorium series, and 12 μSv/a comes from 14C. Areas with high natural background radiation. Some areas have greater dosage than the country-wide averages. In the world in general, exceptionally high natural background locales include Ramsar in Iran, Guarapari in Brazil, Karunagappalli in India, Arkaroola in Australia and Yangjiang in China. The highest level of purely natural radiation ever recorded on the Earth's surface was 90 μGy/h on a Brazilian black beach ("areia preta" in Portuguese) composed of monazite. This rate would convert to 0.8 Gy/a for year-round continuous exposure, but in fact the levels vary seasonally and are much lower in the nearest residences. The record measurement has not been duplicated and is omitted from UNSCEAR's latest reports. Nearby tourist beaches in Guarapari and Cumuruxatiba were later evaluated at 14 and 15 μGy/h. Note that the values quoted here are in grays. To convert to sieverts (Sv), a radiation weighting factor is required; these weighting factors vary from 1 (beta and gamma) to 20 (alpha particles). The highest background radiation in an inhabited area is found in Ramsar, primarily due to the use of local naturally radioactive limestone as a building material. The 1000 most exposed residents receive an average external effective radiation dose of 6 mSv per year, six times the ICRP recommended limit for exposure to the public from artificial sources. They additionally receive a substantial internal dose from radon. 
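The internal-emitter and beach-dose figures quoted above can be cross-checked with the standard activity relation A = λN = (ln 2 / T½)·N. The following Python sketch is purely illustrative; the function name and the rounded physical constants are editorial assumptions, not cited values:

```python
from math import log

AVOGADRO = 6.022e23          # atoms per mole
SECONDS_PER_YEAR = 3.156e7   # ~365.25 days

def activity_bq(mass_g, molar_mass_g_mol, half_life_years):
    """Activity A = lambda * N of a pure radionuclide sample, in becquerels."""
    n_atoms = mass_g / molar_mass_g_mol * AVOGADRO
    decay_constant = log(2) / (half_life_years * SECONDS_PER_YEAR)  # per second
    return decay_constant * n_atoms

# ~17 mg of potassium-40 (half-life 1.25 billion years) in an average body:
print(round(activity_bq(0.017, 40.0, 1.25e9)))  # ~4,500 Bq, i.e. "about 4,000" decays per second

# The record Brazilian beach dose rate, 90 uGy/h, expressed as an annual dose:
print(round(90e-6 * 24 * 365.25, 2))            # ~0.79 Gy/a, i.e. roughly 0.8 Gy/a
```

Both results agree with the figures in the text to within rounding.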
Record radiation levels were found in one house, where the combined effective dose from ambient radiation fields and the internal committed dose from radon was over 80 times the world average natural human exposure to radiation. Epidemiological studies are underway to identify health effects associated with the high radiation levels in Ramsar. It is much too early to draw unambiguous, statistically significant conclusions. While support for beneficial effects of chronic radiation (like longer lifespan) has so far been observed in only a few places, a protective and adaptive effect is suggested by at least one study, whose authors nonetheless caution that data from Ramsar are not yet sufficiently strong to relax existing regulatory dose limits. However, recent statistical analyses have found no correlation between the risk of negative health effects and elevated levels of natural background radiation. Photoelectric. Background radiation doses in the immediate vicinity of particles of high-atomic-number materials within the human body have a small enhancement due to the photoelectric effect. Neutron background. Most of the natural neutron background is a product of cosmic rays interacting with the atmosphere. The neutron energy peaks at around 1 MeV and drops rapidly above that. At sea level, the production of neutrons is about 20 neutrons per second per kilogram of material interacting with the cosmic rays (or about 100–300 neutrons per square meter per second). The flux is dependent on geomagnetic latitude, with a maximum near the magnetic poles. At solar minimum, due to weaker solar magnetic field shielding, the flux is about twice as high as at solar maximum. It also increases dramatically during solar flares. In the vicinity of larger, heavier objects, e.g. buildings or ships, the neutron flux measures higher; this is known as the "cosmic ray induced neutron signature", or "ship effect", as it was first detected with ships at sea. Artificial background radiation. Atmospheric nuclear testing. Frequent above-ground nuclear explosions between the 1940s and 1960s scattered a substantial amount of radioactive contamination. Some of this contamination is local, rendering the immediate surroundings highly radioactive, while some of it is carried longer distances as nuclear fallout; some of this material is dispersed worldwide. The increase in background radiation due to these tests peaked in 1963 at about 0.15 mSv per year worldwide, or about 7% of the average background dose from all sources. The Limited Test Ban Treaty of 1963 prohibited above-ground tests, and by the year 2000 the worldwide dose from these tests had decreased to only 0.005 mSv per year. This global fallout has caused an estimated 200,000–460,000 deaths as of 2020. Occupational exposure. The International Commission on Radiological Protection recommends limiting occupational radiation exposure to 50 mSv (5 rem) per year, and 100 mSv (10 rem) in 5 years. However, background radiation for occupational doses includes radiation that is not measured by radiation dose instruments in potential occupational exposure conditions. This includes both offsite "natural background radiation" and any medical radiation doses. This value is not typically measured or known from surveys, such that variations in the total dose to individual workers are not known. 
This can be a significant confounding factor in assessing radiation exposure effects in a population of workers who may have significantly different natural background and medical radiation doses. This is most significant when the occupational doses are very low. At an IAEA conference in 2002, it was recommended that occupational doses below 1–2 mSv per year do not warrant regulatory scrutiny. Nuclear accidents. Under normal circumstances, nuclear reactors release small amounts of radioactive gases, which cause small radiation exposures to the public. Events classified on the International Nuclear Event Scale as incidents typically do not release any additional radioactive substances into the environment. Large releases of radioactivity from nuclear reactors are extremely rare. To date, there have been two major civilian accidents – the Chernobyl accident and the Fukushima I nuclear accident – which caused substantial contamination. The Chernobyl accident was the only one to cause immediate deaths. Total doses from the Chernobyl accident ranged from 10 to 50 mSv over 20 years for the inhabitants of the affected areas, with most of the dose received in the first years after the disaster, and over 100 mSv for liquidators. There were 28 deaths from acute radiation syndrome. Total doses from the Fukushima I accident were between 1 and 15 mSv for the inhabitants of the affected areas. Thyroid doses for children were below 50 mSv. 167 cleanup workers received doses above 100 mSv, with 6 of them receiving more than 250 mSv (the Japanese exposure limit for emergency response workers). The average dose from the Three Mile Island accident was 0.01 mSv. Non-civilian: in addition to the civilian accidents described above, several accidents at early nuclear weapons facilities – such as the Windscale fire, the contamination of the Techa River by nuclear waste from the Mayak compound, and the Kyshtym disaster at the same compound – released substantial radioactivity into the environment. The Windscale fire resulted in thyroid doses of 5–20 mSv for adults and 10–60 mSv for children. The doses from the accidents at Mayak are unknown. Nuclear fuel cycle. The Nuclear Regulatory Commission, the United States Environmental Protection Agency, and other U.S. and international agencies require that licensees limit radiation exposure to individual members of the public to 1 mSv (100 mrem) per year. Energy sources. Per UNECE life-cycle assessment, nearly all sources of energy result in some level of occupational and public exposure to radionuclides as a result of their manufacture or operation; such exposures are commonly expressed in man-sieverts per gigawatt-year of production. Coal burning. Coal plants emit radiation in the form of radioactive fly ash which is inhaled and ingested by neighbours, and incorporated into crops. A 1978 paper from Oak Ridge National Laboratory estimated that coal-fired power plants of that time might contribute a whole-body committed dose of 19 μSv/a to their immediate neighbours within a radius of 500 m. The United Nations Scientific Committee on the Effects of Atomic Radiation's 1988 report estimated the committed dose 1 km away to be 20 μSv/a for older plants or 1 μSv/a for newer plants with improved fly ash capture, but was unable to confirm these numbers by test. When coal is burned, uranium, thorium and all the uranium daughters accumulated by disintegration – radium, radon, polonium – are released. 
Radioactive materials previously buried underground in coal deposits are released as fly ash or, if fly ash is captured, may be incorporated into concrete manufactured with fly ash. Other sources of dose uptake. Medical. The global average human exposure to artificial radiation is 0.6 mSv/a, primarily from medical imaging. This medical component can range much higher, with an average of 3 mSv per year across the US population. Other human contributors include smoking, air travel, radioactive building materials, historical nuclear weapons testing, nuclear power accidents and nuclear industry operation. A typical chest x-ray delivers 20 μSv (2 mrem) of effective dose. A dental x-ray delivers a dose of 5 to 10 μSv. A CT scan delivers an effective dose to the whole body ranging from 1 to 20 mSv (100 to 2000 mrem). The average American receives about 3 mSv of diagnostic medical dose per year; countries with the lowest levels of health care receive almost none. Radiation treatment for various diseases also accounts for some dose, both in individuals and in those around them. Consumer items. Cigarettes contain polonium-210, originating from the decay products of radon, which stick to tobacco leaves. Heavy smoking results in a radiation dose of 160 mSv/year to localized spots at the bifurcations of the segmental bronchi in the lungs from the decay of polonium-210. This dose is not readily comparable to the radiation protection limits, since the latter deal with whole-body doses, while the dose from smoking is delivered to a very small portion of the body. Radiation metrology. In a radiation metrology laboratory, background radiation refers to the measured value from any incidental sources that affect an instrument when a specific radiation source sample is being measured. This background contribution, which is established as a stable value by multiple measurements, usually made before and after sample measurement, is subtracted from the rate measured when the sample is being measured. This is in accordance with the International Atomic Energy Agency definition of background as being "dose or dose rate (or an observed measure related to the dose or dose rate) attributable to all sources other than the one(s) specified." The same issue occurs with radiation protection instruments, where a reading from an instrument may be affected by the background radiation. An example of this is a scintillation detector used for surface contamination monitoring. In an elevated gamma background the scintillator material will be affected by the background gamma, which will add to the reading obtained from any contamination which is being monitored. In extreme cases it will make the instrument unusable, as the background swamps the lower level of radiation from the contamination. In such instruments the background can be continually monitored in the "Ready" state, and subtracted from any reading obtained when the instrument is used in "Measuring" mode. Regular radiation measurement is carried out at multiple levels. Government agencies compile radiation readings as part of environmental monitoring mandates, often making the readings available to the public and sometimes in near-real-time. Collaborative groups and private individuals may also make real-time readings available to the public. Instruments used for radiation measurement include the Geiger–Müller tube and the scintillation detector. 
The former is usually more compact and affordable and reacts to several radiation types, while the latter is more complex and can detect specific radiation energies and types. Readings indicate radiation levels from all sources including background, and real-time readings are in general unvalidated, but correlation between independent detectors increases confidence in measured levels.
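The background-subtraction procedure described under "Radiation metrology" above can be illustrated with a short sketch. The following Python fragment is hypothetical; the function name and the counts-per-second values are invented for illustration only:

```python
from statistics import mean

def net_count_rate(gross_cps, background_runs):
    """Subtract a stable background, established from repeated measurements
    (e.g. taken before and after the sample run), from the gross count rate
    observed while the sample is being measured."""
    background_cps = mean(background_runs)  # stable background estimate
    return gross_cps - background_cps

# Hypothetical counts-per-second readings:
background = [0.52, 0.49, 0.50, 0.51]  # instrument readings with no sample present
gross = 12.70                          # reading with the sample in place
print(round(net_count_rate(gross, background), 3))  # ~12.195 cps attributable to the sample
```

The same subtraction underlies the "Ready"/"Measuring" behaviour of the contamination monitors described above, where the background is tracked continuously rather than averaged from discrete runs.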
4886
7903804
https://en.wikipedia.org/wiki?curid=4886
Banquo
Lord Banquo, the Thane of Lochaber, is a semi-historical character in William Shakespeare's 1606 play "Macbeth". In the play, he is at first an ally of Macbeth (both are generals in the King's army) and they meet the Three Witches together. After prophesying that Macbeth will become king, the witches tell Banquo that he will not be king himself, but that his descendants will be. Later, Macbeth in his lust for power sees Banquo as a threat and has him murdered by three hired assassins; Banquo's son, Fleance, escapes. Banquo's ghost returns in a later scene, causing Macbeth to react with alarm in public during a feast. Shakespeare borrowed the character Banquo from "Holinshed's Chronicles", a history of Britain published by Raphael Holinshed in 1587. In "Chronicles", Banquo is an accomplice to Macbeth in the murder of the king, rather than a loyal subject of the king who is seen as an enemy by Macbeth. Shakespeare may have changed this aspect of his character to please King James, who was thought at the time to be a descendant of the real Banquo. Critics often interpret Banquo's role in the play as being a foil to Macbeth, resisting evil whereas Macbeth embraces it. Sometimes, however, his motives are unclear, and some critics question his purity. He does nothing to accuse Macbeth of murdering the king, even though he has reason to believe Macbeth is responsible. Sources. Shakespeare often used Raphael Holinshed's "Chronicles of England, Scotland, and Ireland", commonly known as "Holinshed's Chronicles", as a source for his plays, and in "Macbeth", he borrows from several of the tales in that work. Holinshed portrays Banquo as a historical figure, who is an accomplice in the murder by Mac Bethad mac Findlaích (Macbeth) of Donnchad mac Crínáin (King Duncan) and plays an important part in ensuring that Macbeth, not Máel Coluim mac Donnchada (Malcolm), takes the throne in the coup that follows. Holinshed in turn used an earlier work, the "Scotorum Historiae" (1526–7) by Hector Boece, as his source. Boece's work is the first known record of Banquo and his son Fleance (spelled "Banquho" and "Fleancho" in the Latin), and scholars such as David Bevington generally consider them fictional characters invented by Boece. In Shakespeare's day, however, they were considered historical figures of great repute, and the king, James I, based his claim to the throne in part on a descent from Banquo. Within the literature there exist various claims surrounding Thane Banquo's ancestry. According to the 17th-century historian Frederic van Bossen, Thane Banquo (which he wrote as Banqwho and sometimes as Banchou) was the son of Dunclina, the daughter of Albanach ap Crinan, the thane of the Isles, and her husband Kenneth. Kenneth was the son of Fferqwhart, who was the son of Murdoch, the Thane of "Lochabar", the son of Prince Dorus, who was the son of a King named Erlus, whose kingdom was not identified. According to Frederic van Bossen, Banquo married his 4th cousin Mauldvina, the daughter of Thalus the Thane of Atholl, and together they were the parents of Fleance, a daughter called Castisa who married Frederic the Lord of Cromartie, and a number of other sons who were murdered by King Macbeth. It is known that the House of Stuart descends from Walter fitz Alan, Steward of Scotland, and in some studies he is believed to have been the grandson of Fleance and Gruffydd ap Llywelyn's daughter, Nesta ferch Gruffydd. 
However, in Frederic van Bossen's handwritten notes, which were created from numerous resources he collected in his travels through Europe, Fleance's wife is identified as Nesta's sister, Marjoretta the daughter of "griffin ap Livlein". In reality, Walter fitz Alan was the son of Alan fitz Flaad, a Breton knight. Unlike his sources, Shakespeare gives Banquo no role in the King's murder, making it a deed committed solely by Macbeth and his wife, Lady Macbeth. Why Shakespeare's Banquo is so different from the character described by Holinshed and Boece is not known, though critics have proposed several possible explanations. First among them is the risk associated with portraying the king's ancestor as a murderer and conspirator in the plot to overthrow a rightful king, as well as the author's desire to flatter a powerful patron. But Shakespeare may also simply have altered Banquo's character because there was no dramatic need for another accomplice to the murder. There was, however, a need to provide a dramatic contrast to Macbeth; a role that many scholars argue is filled by Banquo. Similarly, when Jean de Schelandre wrote about Banquo in his "Stuartide" in 1611, he also changed the character by portraying him as a noble and honourable man—the critic D.W. Maskell describes him as "...Schelandre's paragon of valour and virtue"—probably for reasons similar to Shakespeare's. Banquo's role in the coup that follows the murder is harder to explain. Banquo's loyalty to Macbeth, rather than Malcolm, after Duncan's death makes him a passive accomplice in the coup: Malcolm, as Prince of Cumberland, is the rightful heir to the throne and Macbeth a usurper. Daniel Amneus argued that "Macbeth" as it survives is a revision of an earlier play, in which Duncan granted Macbeth not only the title of Thane of Cawdor, but the "greater honor" of Prince of Cumberland (i.e. heir to the throne of Scotland). Banquo's silence may be a survival from the posited earlier play, in which Macbeth was the legitimate successor to Duncan. Role in the play. Banquo is in a third of the play's scenes, as both a human and a ghost. As significant as he is to the plot, he has fewer lines than the relatively insignificant Ross, a Scottish nobleman who survives the play. In the second scene of the play, a wounded soldier describes the manner in which Macbeth, Thane of Glamis, and Banquo, Thane of Lochaber, resisted invading forces, fighting side by side. In the next scene, Banquo and Macbeth, returning from the battle together, encounter the Three Witches, who predict that Macbeth will become Thane of Cawdor, and then king. Banquo, sceptical of the witches, challenges them to predict his own future, and they foretell that Banquo will never himself take the throne, but will beget a line of kings. Banquo remains sceptical after the encounter, wondering aloud if evil can ever speak the truth. He warns Macbeth that evil will offer men a small, hopeful truth only to catch them in a deadly trap. When Macbeth kills the king and takes the throne, Banquo—the only one aware of this encounter with the witches—reserves judgment for God. He is unsure whether Macbeth committed regicide to gain the throne, but muses in a soliloquy that "I fear / Thou play'dst most foully for 't". He offers his respects to the new King Macbeth and pledges loyalty. Later, worried that Banquo's descendants and not his own will rule Scotland, Macbeth sends two men, and then a Third Murderer, to kill Banquo and his son Fleance. 
During the melee, Banquo holds off the assailants so that Fleance can escape, but is himself killed. The ghost of Banquo later returns to haunt Macbeth at the banquet in Act Three, Scene Four. A terrified Macbeth sees him, while the apparition is invisible to his guests. He appears again to Macbeth in a vision granted by the Three Witches, wherein Macbeth sees a long line of kings descended from Banquo. Analysis. Foil to Macbeth. Many scholars see Banquo as a foil and a contrast to Macbeth. Macbeth, for example, eagerly accepts the Three Witches' prophecy as true and seeks to help it along. Banquo, on the other hand, doubts the prophecies and the intentions of these seemingly evil creatures. Whereas Macbeth places his hope in the prediction that he will be king, Banquo argues that evil only offers gifts that lead to destruction. Banquo steadily resists the temptations of evil within the play, praying to heaven for help, while Macbeth seeks darkness, and prays that evil powers will aid him. This is visible in act two; after Banquo sees Duncan to bed, he says: "There's husbandry in heaven, / Their candles are all out". This premonition of the coming darkness in association with Macbeth's murders is repeated just before Banquo is killed: "it will be rain to-night", Banquo tells his son Fleance. Banquo's status as a contrast to Macbeth makes for some tense moments in the play. In act two, scene one, Banquo meets his son Fleance and asks him to take both his sword and his dagger ("Hold, take my sword ... Take thee that too"). He also explains that he has been having trouble sleeping due to "cursed thoughts that nature / gives way to in repose!" On Macbeth's approach, he demands the sword returned to him quickly. Scholars have interpreted this to mean that Banquo has been dreaming of murdering the king as Macbeth's accomplice to take the throne for his own family, as the Three Witches prophesied to him. In this reading, his good nature is so revolted by these thoughts that he gives his sword and dagger to Fleance to be sure they do not come true, but is so nervous at Macbeth's approach that he demands them back. Other scholars have responded that Banquo's dreams have less to do with killing the king and more to do with Macbeth. They argue that Banquo is merely setting aside his sword for the night. Then, when Macbeth approaches, Banquo, having had dreams about Macbeth's deeds, takes back his sword as a precaution in this case. Macbeth eventually sees that Banquo can no longer be trusted to aid him in his evil, and considers his friend a threat to his newly acquired throne; thus, he has him murdered. Banquo's ability to live on in different ways is another oppositional force, in this case to Macbeth's impending death. His spirit lives on in Fleance, his son, and in his ghostly presence at the banquet. Ghost scenes. When Macbeth returns to the witches later in the play, they show him an apparition of the murdered Banquo, along with eight of his descendants. The scene carries deep significance: King James, on the throne when "Macbeth" was written, was believed to be separated from Banquo by nine generations. What Shakespeare writes here thus amounts to a strong support of James' right to the throne by lineage, and for audiences of Shakespeare's day, a very real fulfilment of the witches' prophecy to Banquo that his sons would take the throne. This apparition is also deeply unsettling to Macbeth, who not only wants the throne for himself, but also desires to father a line of kings. 
Banquo's other appearance as a ghost during the banquet scene serves as an indicator of Macbeth's conscience returning to plague his thoughts. Banquo's triumph over death appears symbolically, insofar as he literally takes Macbeth's seat during the feast. Shocked, Macbeth uses words appropriate to the metaphor of usurpation, describing Banquo as "crowned" with wounds. The spirit drains Macbeth's manhood along with the blood from his cheeks; as soon as Banquo's form vanishes, Macbeth announces: "Why, so; being gone, / I am a man again." Like the vision of Banquo's lineage, the banquet scene has also been the subject of criticism. Critics have questioned whether not one, but perhaps two ghosts appear in this scene: Banquo and Duncan. Scholars arguing that Duncan attends the banquet state that Macbeth's lines to the Ghost could apply equally well to the slain king. "Thou canst not say I did it", for example, can mean that Macbeth is not the man who actually killed Banquo, or it can mean that Duncan, who was asleep when Macbeth killed him, cannot claim to have seen his killer. To add to the confusion, some lines Macbeth directs to the ghost, such as "Thy bones are marrowless", cannot rightly be said of Banquo, who has only recently died. Scholars debate whether Macbeth's vision of Banquo is real or a hallucination. Macbeth had already seen a hallucination before murdering Duncan: a knife hovering in the air. Several performances of the play have even ignored the stage direction to have the Ghost of Banquo enter at all, heightening the sense that Macbeth is growing mad, since the audience cannot see what he claims to see. Scholars opposing this view claim that while the dagger is unusual, ghosts of murdered victims are more believable, having a basis in the audience's superstitions. Spirits in other Shakespeare plays—notably "Hamlet" and "A Midsummer Night's Dream"—exist in ambiguous forms, occasionally even calling into question their own presence. The concept of a character being confronted at a triumphant feast with a reminder of their downfall is not unique to Shakespeare and may originate from Belshazzar's feast, as portrayed in the Bible. The term 'ghost at the feast' has entered popular culture, and is often used as a metaphor for a subject a person would rather avoid considering, or (considering the general plot of "Macbeth") a reminder of a person's unpleasant past or likely future. Performances and interpretations. Banquo's role, especially in the banquet ghost scene, has been subject to a variety of mediums and interpretations. Shakespeare's text states: "Enter Ghost of Banquo, and sits in Macbeth's place." Several television versions have altered this slightly, having Banquo appear suddenly in the chair, rather than walking onstage and into it. Special effects and camera tricks also allow producers to make the ghost disappear and reappear, highlighting the fact that "only" Macbeth can see it. Stage directors, unaided by post-production effects and camera tricks, have used other methods to depict the ghost. In the late 19th century, elaborate productions of the play staged by Henry Irving employed a wide variety of approaches for this task. In 1877 a green silhouette was used to create a ghostlike image; ten years later a trick chair was used to allow an actor to appear in the middle of the scene, and then again from the midst of the audience. In 1895 a shaft of blue light served to indicate the presence of Banquo's spirit. 
In 1933 the Russian director Theodore Komisarjevsky staged a modern retelling of the play (Banquo and Macbeth were told of their future through palmistry); he used Macbeth's shadow as the ghost. In 1936, Orson Welles directed the Federal Theatre Project production of the play, with an African-American cast that included Canada Lee in the role of Banquo. Film adaptations have approached Banquo's character in a variety of ways. Akira Kurosawa's 1957 adaptation "Throne of Blood" makes the character into Captain Miki (played by Minoru Chiaki), slain by Macbeth's equivalent (Captain Washizu) when Washizu's wife explains that she is with child. News of Miki's death does not reach Washizu until after he has seen the ghost in the banquet scene. In Roman Polanski's 1971 adaptation, Banquo is played by acclaimed stage actor Martin Shaw, in a style reminiscent of earlier stage performances. Polanski's version also emphasises Banquo's objection to Macbeth's ascendency by showing him remaining silent as the other thanes around him hail Macbeth as king. In the 1990 film "Men of Respect", a reimagining of "Macbeth" as taking place among a New York Mafia crime family, the character of Banquo is named "Bankie Como" and played by American actor Dennis Farina.
4887
45402633
https://en.wikipedia.org/wiki?curid=4887
British Army
The British Army is the principal land warfare force of the United Kingdom. It comprises 73,847 regular full-time personnel, 4,127 Gurkhas, 25,742 volunteer reserve personnel and 4,697 "other personnel", for a total of 108,413. The British Army traces back to 1707 and the formation of the united Kingdom of Great Britain, which joined the Kingdoms of England and Scotland into a single state and, with that, united the English Army and the Scots Army as the British Army. The English Bill of Rights 1689 and Scottish Claim of Right Act 1689 require parliamentary consent for the Crown to maintain a peacetime standing army. Members of the British Army swear allegiance to the monarch as their commander-in-chief. The army is administered by the Ministry of Defence and commanded by the Chief of the General Staff. At its inception the British Army, composed primarily of cavalry and infantry, was one of two "Regular" Forces within the British military (those parts of the British Armed Forces tasked with land warfare, as opposed to the naval forces); there were also separate "Reserve Forces". The other Regular Force was the "Ordnance Military Corps" (made up of the Royal Artillery, Royal Engineers, and the Royal Sappers and Miners) of the Board of Ordnance, which, along with the originally civilian Commissariat Department, stores and supply departments, and barracks and other departments, was absorbed into the British Army when the Board of Ordnance was abolished in 1855. Various other civilian departments of the board were absorbed into the War Office. The British Army has seen action in major wars between the world's great powers, including the Seven Years' War, the American Revolutionary War, the Napoleonic Wars, the Crimean War and the First and Second World Wars. Britain's victories in most of these decisive wars allowed it to influence world events and establish itself as one of the world's leading military and economic powers. Since the end of the Cold War, the British Army has been deployed to a number of conflict zones, often as part of an expeditionary force, a coalition force or a United Nations peacekeeping operation. History. Formation. Until the Wars of the Three Kingdoms (1639–1653), neither England nor Scotland had had a standing army with professional officers and career corporals and sergeants. England relied on militia organised by local officials, private forces mobilised by the nobility, or hired mercenaries from Europe. From the later Middle Ages until the Wars of the Three Kingdoms, when a foreign expeditionary force was needed, such as the one that Henry V of England took to France and that fought at the Battle of Agincourt (1415), a professional army was raised for the duration of the expedition. During the Wars of the Three Kingdoms, the members of the English Long Parliament realised that county militia organised into regional associations (such as the Eastern Association), often commanded by local members of Parliament (both from the House of Commons and the House of Lords), while more than able to hold their own in the regions which Parliamentarians ("Roundheads") controlled, were unlikely to win the war. So Parliament initiated two actions. The Self-denying Ordinance forbade members of Parliament (with the notable exception of Oliver Cromwell, then a member of parliament and future Lord Protector) from serving as officers in the Parliamentary armies.
This created a distinction between the civilians in Parliament, who tended to be Presbyterian and conciliatory towards the Royalists ("Cavaliers"), and a corps of professional officers, who tended to be Independent (Congregational) in theology. The second action was legislation for the creation of a Parliamentary-funded army, commanded by Lord General Thomas Fairfax, which became known as the New Model Army (originally phrased "new-modelled Army"). While this proved to be a war-winning formula, the New Model Army, being organised and politically active, went on to dominate the politics of the Interregnum and by 1660 was widely disliked. The New Model Army was paid off and disbanded at the Restoration of the monarchy in 1660 with the accession of King Charles II. For many decades the alleged excesses of the New Model Army under the Protectorate/Commonwealth of Oliver Cromwell were used as propaganda (and still feature in Irish folklore), and the Whig element recoiled from allowing a standing army to continue under the restored monarchy. The militia acts of 1661 and 1662 prevented local authorities from calling up militia and oppressing their own local opponents; calling up the militia was possible only if the king and local elites agreed to do so. King Charles II and his "Cavalier" (Royalist) supporters favoured a new army under royal control, and immediately after the Restoration of 1660 to 1661 began working on its establishment. The first English Army regiments, including elements of the disbanded New Model Army, were formed between November 1660 and January 1661 and became a standing military force for England (financed by Parliament). The Royal Scots and Irish Armies were financed by the parliaments of Scotland and Ireland. Parliamentary control was established by the Bill of Rights 1689 and Claim of Right Act 1689, although the monarch continued to influence aspects of army administration until at least the end of the 19th century. After the Restoration, King Charles II pulled together four regiments of infantry and cavalry, calling them his guards, at a cost of £122,000 from his general budget. This became the foundation of the permanent English Army. By 1685, it had grown to number 7,500 soldiers in marching regiments, and 1,400 men permanently stationed in garrisons. The Monmouth Rebellion in 1685 allowed the succeeding King James II to raise the forces to 20,000 men. There had been 37,000 in 1678, when England played a role in the closing stage of the cross-channel Franco-Dutch War. After a short constitutional crisis, in which Parliament forced the Catholic King James II from the throne and into exile, the Protestant monarchs William III (formerly William of the Dutch House of Orange) and his wife Mary II (James's daughter) jointly acceded to the throne. England then involved itself in the War of the Grand Alliance on the Continent, primarily to prevent the Catholic French monarchy from organising an invasion to restore the exiled James II. Later in 1689, to solidify his and Mary's hold on the monarchy, William III expanded the new English army to 74,000, and then to 94,000 in 1694. Parliament was very nervous and reduced the cadre to 70,000 in 1697. Scotland and Ireland had theoretically separate military establishments, but they were unofficially merged with the English Crown force.
By the time of the 1707 Acts of Union, many regiments of the English and Scottish armies were combined under one operational command and stationed in the Netherlands for the War of the Spanish Succession. Although all the regiments were now part of the new British military establishment, they remained under the old operational-command structure and retained much of the institutional ethos, customs and traditions of the standing armies created shortly after the Restoration of the Monarchy 47 years earlier. The order of seniority of the most-senior British Army line regiments is based on that of the earlier English army. Although technically the Scots Royal Regiment of Foot was raised in 1633 and is the oldest Regiment of the Line, Scottish and Irish regiments were only allowed to take a rank in the English army on the date of their arrival in England (or the date when they were first placed on the English establishment). In 1694, a board of general officers was convened to decide the rank of English, Irish and Scots regiments serving in the Netherlands; the regiment which became known as the Scots Greys were designated the 4th Dragoons because there were three English regiments raised prior to 1688 when the Scots Greys were first placed in the English establishment. In 1713, when a new board of general officers was convened to decide the rank of several regiments, the seniority of the Scots Greys was reassessed and based on their June 1685 entry into England. At that time there was only one English regiment of dragoons, and the Scots Greys eventually received the British Army rank of 2nd Dragoons. British Empire (1707–1914). After 1707, British continental policy was to contain expansion by competing powers such as France and Spain. Although Spain was the dominant global power during the previous two centuries and the chief threat to England's early trans-Atlantic colonial ambitions, its influence was now waning. The territorial ambitions of the French, however, led to the War of the Spanish Succession and the later Napoleonic Wars. Although the Royal Navy is widely regarded as vital to the rise of the British Empire, the British Army played an important role in the formation of colonies, protectorates and dominions in the Americas, Africa, Asia, India and Australasia. British soldiers captured strategically important sites and territories, with the army involved in wars to secure the empire's borders, internal safety and support friendly governments and princes. Among these actions were the French and Indian War / Seven Years' War, the American Revolutionary War, the Napoleonic Wars, the First and Second Opium Wars, the Boxer Rebellion, the New Zealand Wars, the Australian frontier wars, the Sepoy Rebellion of 1857, the first and second Boer Wars, the Fenian raids, the Irish War of Independence, interventions in Afghanistan (intended to maintain a buffer state between British India and the Russian Empire) and the Crimean War (to keep the Russian Empire to the north on the Black Sea at a safe distance by aiding the Ottoman Empire). Like the English Army, the British Army fought the kingdoms of Spain, France (including the First French Empire) and the Netherlands (Dutch Republic) for supremacy in North America and the West Indies. 
With native, provincial and colonial assistance, the Army conquered New France in the French and Indian War (the North American theatre of the parallel Seven Years' War) and suppressed a Native American uprising around the Great Lakes in Pontiac's War. The British Army was defeated in the American Revolutionary War, losing the Thirteen Colonies but retaining The Canadas and The Maritimes in British North America, including Bermuda (originally part of the Colony of Virginia, which had been strongly sympathetic to the American colonial rebels early in the war). Halifax, Nova Scotia and Bermuda were to become Imperial fortresses (although Bermuda, being safer from attack over water and impervious to attack overland, quickly became the most important in British North America), along with Malta and Gibraltar, providing bases in the eastern Atlantic Ocean and Mediterranean Sea for Royal Navy squadrons to control the oceans and trade routes, and heavily garrisoned by the British Army both for defence of the bases and to provide mobile military forces to work with the Navy in amphibious operations throughout their regions. The British Army was heavily involved in the Napoleonic Wars, participating in a number of campaigns in Europe (including continuous deployment in the Peninsular War), the Caribbean, North Africa and North America. The war between the British and the First French Empire of Napoleon Bonaparte stretched around the world; at its peak in 1813, the regular army contained over 250,000 men. A coalition of Anglo-Dutch and Prussian armies under the Duke of Wellington and Field Marshal von Blücher finally defeated Napoleon at Waterloo in 1815. The English were involved politically and militarily in Ireland. The campaign of English republican Protector Oliver Cromwell involved uncompromising treatment of the Irish towns (most notably Drogheda and Wexford) which supported the Royalists during the English Civil War. The English Army (and the subsequent British Army) remained in Ireland primarily to suppress Irish revolts or disorder. In addition to its conflict with Irish nationalists, it was faced with the prospect of battling Anglo-Irish and Ulster Scots in Ireland who were angered by unfavourable taxation of Irish produce imported into Britain. With other Irish groups, they raised a volunteer army and threatened to emulate the American colonists if their conditions were not met. Learning from their experience in America, the British government sought a political solution. The British Army fought Irish rebels—Protestant and Catholic—primarily in Ulster and Leinster (Wolfe Tone's United Irishmen) in the 1798 rebellion. In addition to battling the armies of other European empires (and its former colonies, the United States, in the War of 1812), the British Army fought the Chinese in the First and Second Opium Wars and the Boxer Rebellion, Māori tribes in the first of the New Zealand Wars, Nawab Shiraj-ud-Daula's forces and British East India Company mutineers in the Sepoy Rebellion of 1857, the Boers in the first and second Boer Wars, Irish Fenians in Canada during the Fenian raids and Irish separatists in the Anglo-Irish War. The increasing demands of imperial expansion and the inadequacy and inefficiency of the underfunded British Army, Militia, Ordnance Military Corps, Yeomanry and Volunteer Force after the Napoleonic Wars led to a series of reforms following the failures of the Crimean War.
Inspired by the successes of the Prussian Army (which relied on short-term conscription of all eligible young men to maintain a large reserve of recently discharged soldiers, ready to be recalled on the outbreak of war to immediately bring the small peacetime regular army up to strength), the "Regular Reserve" of the British Army was originally created in 1859 by Secretary of State for War Sidney Herbert, and re-organised under the Reserve Force Act 1867. Prior to this, a soldier was generally enlisted into the British Army for a 21-year engagement, following which (should he survive so long) he was discharged as a Pensioner. Pensioners were sometimes still employed on garrison duties, as were younger soldiers no longer deemed fit for expeditionary service, who were generally organised in invalid units or returned to the regimental depot for home service. The cost of paying pensioners, and the government's obligation to continue employing invalids, as well as soldiers deemed by their commanding officers to be detriments to their units, were motivations to change this system. The long period of engagement also discouraged many potential recruits. The long-service enlistments were consequently replaced with short-service enlistments, with undesirable soldiers not permitted to re-engage on the completion of their first engagement. The size of the army also fluctuated greatly, increasing in wartime and drastically shrinking with peace. Battalions posted on garrison duty overseas were allowed an increase on their normal peacetime establishment, which resulted in their having surplus men on their return to a "Home" station. Consequently, soldiers engaging on short-service enlistments were able to serve several years with the colours and the remainder in the Regular Reserve, remaining liable for recall to the colours if required. Among other benefits, this enabled the British Army to have a ready pool of recently trained men to draw upon in an emergency. The name of the Regular Reserve (which for a time was divided into a "First Class" and a "Second Class") has resulted in confusion with the "Reserve Forces", which were the pre-existing part-time, local-service home-defence forces that were auxiliary to the British Army (or "Regular Force"), but not originally part of it: the Yeomanry, Militia (or "Constitutional Force") and Volunteer Force. These were consequently also referred to as "Auxiliary Forces" or "Local Forces". The late-19th-century Cardwell and Childers Reforms gave the army its modern shape and redefined its regimental system. The 1907 Haldane Reforms created the Territorial Force as the army's volunteer reserve component, merging and reorganising the Volunteer Force and Yeomanry, while the Militia was replaced by the Special Reserve. World Wars (1914–1945). Great Britain was challenged by other powers, primarily the German Empire and Nazi Germany, during the 20th century. A century earlier it vied with Napoleonic France for global pre-eminence, and Hanoverian Britain's natural allies were the kingdoms and principalities of northern Germany. By the middle of the 19th century, Britain and France were allies in preventing Russia's appropriation of the Ottoman Empire, although the fear of French invasion led shortly afterwards to the creation of the Volunteer Force.
By the first decade of the 20th century, the United Kingdom was allied with France (by the Entente Cordiale) and Russia (which had a secret agreement with France for mutual support in a war against the Prussian-led German Empire and the Austro-Hungarian Empire). When the First World War broke out in August 1914 the British Army sent the British Expeditionary Force (BEF), consisting mainly of regular army troops, to France and Belgium. The fighting bogged down into static trench warfare for the remainder of the war. In 1915 the army created the Mediterranean Expeditionary Force to invade the Ottoman Empire via Gallipoli, an unsuccessful attempt to capture Constantinople and secure a sea route to Russia. The First World War was the most devastating in British military history, with nearly 800,000 men killed and over two million wounded. Early in the war, the BEF was virtually destroyed and was replaced first by volunteers and then by a conscript force. Major battles included those at the Somme and Passchendaele. Advances in technology saw the advent of the tank (and the creation of the Royal Tank Regiment) and advances in aircraft design (and the creation of the Royal Flying Corps), which would be decisive in future battles. Trench warfare dominated Western Front strategy for most of the war, and the use of chemical weapons (disabling and poison gases) added to the devastation. The Second World War broke out in September 1939 with the German and Soviet invasions of Poland. British assurances to the Poles led the British Empire to declare war on Germany. As in the First World War, a relatively small BEF was sent to France but then hastily evacuated from Dunkirk as the German forces swept through the Low Countries and across France in May 1940. After the British Army recovered from its earlier defeats, it defeated the Germans and Italians at the Second Battle of El Alamein in North Africa in 1942–1943 and helped drive them from Africa. It then fought through Italy and, with the help of American, Canadian, Australian, New Zealand, Indian and Free French forces, was the principal organiser and participant in the D-Day invasion of Normandy on 6 June 1944; nearly half the Allied soldiers were British. In the Far East, the British Army rallied against the Japanese in the Burma Campaign and regained the British Far Eastern colonial possessions. Postcolonial era (1945–2000). After the Second World War the British Army was significantly reduced in size, although National Service continued until 1960. This period saw decolonisation begin with the partition and independence of India and Pakistan, followed by the independence of British colonies in Africa and Asia. The Corps Warrant, the official list of which bodies of the British military (as opposed to naval) forces were to be considered Corps of the British Army for the purposes of the Army Act, the Reserve Forces Act, 1882, and the Territorial and Reserve Forces Act, 1907, had not been updated since 1926 (Army Order 49 of 1926), although amendments had been made up to and including Army Order 67 of 1950. A new Corps Warrant was declared in 1951. Although the British Army was a major participant in Korea in the early 1950s and Suez in 1956, during this period Britain's role in world events was reduced and the army was downsized. The British Army of the Rhine, consisting of I (BR) Corps, remained in Germany as a bulwark against Soviet invasion.
The Cold War continued, with significant technological advances in warfare, and the army saw the introduction of new weapons systems. Despite the decline of the British Empire, the army was engaged in Aden, Indonesia, Cyprus, Kenya and Malaya. In 1982, the British Army and the Royal Marines helped liberate the Falkland Islands during the conflict with Argentina after that country's invasion of the British territory. In the three decades following 1969, the army was heavily deployed in Northern Ireland's Operation Banner to support the Royal Ulster Constabulary (later the Police Service of Northern Ireland) in their conflict with republican paramilitary groups. The locally recruited Ulster Defence Regiment was formed, becoming home-service battalions of the Royal Irish Regiment in 1992 before it was disbanded in 2006. Over 700 soldiers were killed during the Troubles. Following the 1994–1996 IRA ceasefires and since 1997, demilitarisation has been part of the peace process and the military presence has been reduced. On 25 June 2007 the 2nd Battalion of the Princess of Wales's Royal Regiment left the army complex in Bessbrook, County Armagh, ending the longest operation in British Army history. Persian Gulf War. The British Army contributed 50,000 troops to the coalition which fought Iraq in the Persian Gulf War, and British forces controlled Kuwait after its liberation. Forty-seven British military personnel died during the war. Balkan conflicts. The army was deployed to former Yugoslavia in 1992. Initially part of the United Nations Protection Force, in 1995 its command was transferred to the Implementation Force (IFOR) and then to the Stabilisation Force in Bosnia and Herzegovina (SFOR); the commitment rose to over 10,000 troops. In 1999, British forces under SFOR command were sent to Kosovo and the contingent increased to 19,000 troops. Between early 1993 and June 2010, 72 British military personnel died during operations in the former Yugoslavian countries of Bosnia, Kosovo and Macedonia. The Troubles. Although there have been permanent garrisons in Northern Ireland throughout its history, the British Army was deployed as a peacekeeping force from 1969 to 2007 in Operation Banner. Initially, this was (in the wake of unionist attacks on nationalist communities in Derry and Belfast) to prevent further loyalist attacks on Catholic communities; it developed into support of the Royal Ulster Constabulary (RUC) and its successor, the Police Service of Northern Ireland (PSNI) against the Provisional Irish Republican Army (PIRA). Under the 1998 Good Friday Agreement, there was a gradual reduction in the number of soldiers deployed. In 2005, after the PIRA declared a ceasefire, the British Army dismantled posts, withdrew many troops and restored troop levels to those of a peacetime garrison. Operation Banner ended at midnight on 31 July 2007 after about 38 years of continuous deployment, the longest in British Army history. According to an internal document released in 2007, the British Army had failed to defeat the IRA but made it impossible for them to win by violence. Operation Helvetic replaced Operation Banner in 2007, maintaining fewer service personnel in a more-benign environment. Of the 300,000 troops who served in Northern Ireland since 1969, there were 763 British military personnel killed and 306 killed by the British military, mostly civilians. An estimated 100 soldiers committed suicide during Operation Banner or soon afterwards and a similar number died in accidents. 
A total of 6,116 were wounded. Sierra Leone. The British Army deployed to Sierra Leone for Operation Palliser in 1999, under United Nations resolutions, to aid the government in quelling violent uprisings by militiamen. British troops also provided support during the 2014 West African Ebola virus epidemic. Recent history (2000–present). War in Afghanistan. In November 2001, as part of Operation Enduring Freedom with the United States, the United Kingdom deployed forces in Afghanistan to topple the Taliban in Operation Herrick. The 3rd Division was sent to Kabul to assist in the liberation of the capital and defeat Taliban forces in the mountains. In 2006 the British Army began concentrating on fighting Taliban forces and bringing security to Helmand Province, with about 9,500 British troops (including marines, airmen and sailors) deployed at its peak—the second-largest force after that of the US. In December 2012 Prime Minister David Cameron announced that the combat mission would end in 2014, and troop numbers gradually fell as the Afghan National Army took over the brunt of the fighting. Between 2001 and 26 April 2014 a total of 453 British military personnel died in Afghan operations. Operation Herrick ended with the handover of Camp Bastion on 26 October 2014, but the British Army maintained a deployment in Afghanistan as part of Operation Toral. Following an announcement by the US Government of the end of its operations in Afghanistan, the Ministry of Defence announced in April 2021 that British forces would withdraw from the country by 11 September 2021. It was later reported that all UK troops would be out by early July. Following the collapse of the Afghan Army, and the completion of the withdrawal of civilians, all British troops had left by the end of August 2021. Iraq War. In 2003, the United Kingdom was a major contributor to the invasion of Iraq, sending a force of over 46,000 military personnel. The British Army controlled southern Iraq, and maintained a peace-keeping presence in Basra. All British troops were withdrawn from Iraq by 30 April 2009, after the Iraqi government refused to extend their mandate. One hundred and seventy-nine British military personnel died in Iraqi operations. The British Armed Forces returned to Iraq in 2014 as part of Operation Shader to counter the Islamic State (ISIL). Recent military aid. The British Army maintains a standing liability to support the civil authorities in certain circumstances, usually either with niche capabilities (e.g. explosive ordnance removal) or in general support of the civil authorities when their capacity is exceeded. In recent years this has seen army personnel supporting the civil authorities in the face of the 2001 United Kingdom foot-and-mouth outbreak, the 2002 firefighters strike, widespread flooding in 2005, 2007, 2009, 2013 and 2014, Operation Temperer following the Manchester Arena bombing in 2017 and, most recently, Operation Rescript during the COVID-19 pandemic. Baltic States. Since 2016, the British Army has maintained a presence in the Baltic States in support of the NATO Enhanced Forward Presence strategy, which responded to the 2014 Russian annexation of Crimea. The British Army leads a multinational armoured battlegroup in Estonia under Operation Cabrit and contributes troops to another military battlegroup in Poland. As part of the NATO plans, Britain has committed a full mechanised infantry brigade to be on a high state of readiness to defend Estonia. Ukraine.
Between 2015 and 2022, the British Army deployed Short Term Training Teams (STTTs) to Ukraine under Operation Orbital to help train the Armed Forces of Ukraine against further Russian aggression. This operation was succeeded by Operation Interflex in July 2022. Modern army. Personnel. The British Army has been a volunteer force since national service ended during the 1960s. Since the creation of the part-time, reserve Territorial Force in 1908 (renamed the Army Reserve in 2014), the full-time British Army has been known as the Regular Army. In July 2020 there were just over 78,800 Regulars, with a target strength of 82,000, and just over 30,000 Army Reservists, with a target strength of 30,000. All former Regular Army personnel may also be recalled to duty in exceptional circumstances during the 6-year period following completion of their Regular service, which creates an additional force known as the Regular Reserve. The table below illustrates British Army personnel figures from 1710 to 2025. Equipment. Infantry. The British Army's basic weapon is the 5.56 mm L85A2 or L85A3 assault rifle, with some specialist personnel (pilots and some tank crew) using the L22A2 carbine variant. The weapon was traditionally equipped with either iron sights or an optical SUSAT, although other optical sights have subsequently been purchased to supplement these. The weapon can be further enhanced via the Picatinny rail with attachments such as the L17A2 under-barrel grenade launcher. In 2023, the Army Special Operations Brigade, which includes the Ranger Regiment, began using the L403A1, an AR-pattern rifle also used by the Royal Marines. Some soldiers are equipped with the 7.62 mm L129A1 sharpshooter rifle, which in 2018 formally replaced the L86A2 Light Support Weapon. Support fire is provided by the L7 general-purpose machine gun (GPMG), and indirect fire by L16 81 mm mortars. Sniper rifles include the L118A1 7.62 mm, the L115A3 and the AW50F, all manufactured by Accuracy International. The British Army utilises the Glock 17 as its side arm. Anti-tank guided weapons include the Javelin, the medium-range replacement for the Milan with overfly and direct-attack modes of operation, and the Next-generation Light Anti-tank Weapon (NLAW), a short-range anti-tank missile designed for use by non-specialist soldiers and capable of knocking out a main battle tank with a single shot by striking it from above. Armour. The army's main battle tank is the Challenger 2, which is being upgraded to Challenger 3. It is supported by the Warrior tracked armoured vehicle as the primary infantry fighting vehicle (soon to be replaced by the Boxer 8x8 armoured fighting vehicle) and the Bulldog armoured personnel carrier. Light armoured units often utilise the Supacat "Jackal" MWMIK and Coyote tactical support vehicle for reconnaissance and fire support. Artillery. The army has three main artillery systems: the M270 multiple launch rocket system (MLRS), the Archer and the L118 light gun. The MLRS, first used in Operation Granby, fires its standard rockets to a shorter range and can reach up to 500 km with the PrSM. The Archer is a 155 mm self-propelled armoured gun. The L118 light gun is a 105 mm towed gun, typically towed by a Pinzgauer all-terrain vehicle. To identify artillery targets, the army operates the TAIPAN artillery detection radar and utilises artillery sound ranging. For air defence it uses the new Sky Sabre system, which in 2021 replaced the Rapier.
It also deploys the Very Short-Range Air Defence (VSHORAD) Starstreak HVM (high-velocity missile), launched either by a single soldier or from a Stormer vehicle-mounted launcher. Protected mobility. Where armour is not required or mobility and speed are favoured, the British Army utilises protected patrol vehicles, such as the Panther variant of the Iveco LMV, the Foxhound, and variants of the Cougar family (such as the Ridgeback, Husky and Mastiff). For day-to-day utility work the army commonly uses the Land Rover Wolf, which is based on the Land Rover Defender. Engineers, utility and signals. Specialist engineering vehicles include bomb-disposal robots, such as the T7 Multi-Mission Robotic System, and the modern variants of the Armoured Vehicle Royal Engineers, including the Titan bridge-layer, the Trojan armoured engineer vehicle and the Terrier armoured digger. Day-to-day utility work uses a series of support vehicles, including six-, nine- and fifteen-tonne MAN trucks, Oshkosh heavy-equipment transporters (HET), close-support tankers, quad bikes and ambulances. Tactical communication uses the Bowman radio system, and operational or strategic communication is controlled by the Royal Corps of Signals. Aviation. The Army Air Corps (AAC) provides direct aviation support, with the Royal Air Force providing support helicopters. The primary attack helicopter is the Boeing AH-64E Apache, which replaced the AgustaWestland Apache AH-1 in the anti-tank, anti-air defence, and anti-armour role. The AgustaWestland AW159 Wildcat is a dedicated intelligence, surveillance, target acquisition, and reconnaissance (ISTAR) helicopter. The Eurocopter AS 365N Dauphin is used for special operations aviation, primarily counter-terrorism operations, within the UK. The army operates unmanned aerial vehicles in a surveillance role, such as the small Lockheed Martin Desert Hawk III. Structure. Army Headquarters is located in Andover, Hampshire, and is responsible for providing forces at operational readiness for employment by the Permanent Joint Headquarters. The command structure is hierarchical, with overall command residing with the Chief of the General Staff (CGS), who is immediately subordinate to the Chief of the Defence Staff, the head of the British Armed Forces. The CGS is supported by the Deputy Chief of the General Staff. Army Headquarters is further organised into two subordinate commands, Field Army and Home Command, each commanded by a lieutenant general. These two commands serve distinct purposes and are divided into a structure of divisions and brigades, which themselves consist of a complex mix of smaller units such as battalions. British Army units are either full-time 'Regular' units or part-time Army Reserve units. Allied Rapid Reaction Corps. Led by a British Army three-star general, it is one of NATO's High Readiness (Land) Forces, based in Gloucestershire, UK, with the following British units under its command: Field Army. Led by Commander Field Army, the Field Army is responsible for generating and preparing forces for current and contingency operations. The Field Army comprises: Home Command. Home Command is the British Army's supporting command; a generating, recruiting and training force that supports the Field Army and delivers UK resilience. It comprises: Special Forces. The British Army contributes two of the three special forces formations to the United Kingdom Special Forces directorate: the Special Air Service (SAS) and the Special Reconnaissance Regiment (SRR).
The SAS consists of one regular and two reserve regiments. The regular regiment, 22 SAS, has its headquarters at Stirling Lines, Credenhill, Herefordshire. It consists of 5 squadrons (A, B, D, G and Reserve) and a training wing. 22 SAS is supported by 2 reserve regiments, 21 SAS and 23 SAS, which collectively form the Special Air Service (Reserve) (SAS[R]) and which in 2020 were transferred back under the command of the Director of Special Forces, having previously been under the command of the 1st Intelligence, Surveillance and Reconnaissance Brigade. The SRR, formed in 2005, performs close reconnaissance and special surveillance tasks. The Special Forces Support Group, under the operational control of the Director of Special Forces, provides operational manoeuvring support to the United Kingdom Special Forces. Colonial units. The British Army historically included many units from what are now separate Commonwealth realms. When the English Empire was established in North America (including Bermuda) and the West Indies in the early 17th century, there was no standing English Army, only the Militia, Yeomanry and Royal bodyguards, of which the Militia, as the primary home-defence force, was immediately extended to the colonies. At first, colonial militias defended the colonies on their own against indigenous peoples and European competitors. Once the standing English Army, later the British Army, came into existence and began to garrison the colonies, the colonial militias fought side by side with it in a number of wars, including the Seven Years' War. Some of the colonial militias rebelled during the American War of Independence. The militia fought alongside the regular British Army (and native allies) in defending British North America from their former countrymen during the War of 1812. Locally raised units in strategically located Imperial fortress colonies (including Nova Scotia before the Canadian Confederation; Bermuda, which was treated as part of The Maritimes under the Commander-in-Chief at Nova Scotia until Canadian Confederation; Gibraltar; and Malta) and the Channel Islands were generally maintained from army funds and more fully integrated into the British Army, as is evident from their placements in British Army lists, unlike units such as the King's African Rifles. The larger colonies (Australia, New Zealand, Canada, South Africa, etc.) mostly achieved Commonwealth Dominion status before or after the First World War and were granted full legislative independence in 1931. While these dominions remained within the British Empire, their governments were placed on a par with the British government, and hence their military units comprised separate armies (e.g. the Australian Army), although Canada retained the term "militia" for its military forces until the Second World War. From the 1940s, these dominions and many colonies chose full independence, usually becoming Commonwealth realms (as member states of the Commonwealth are known today). Units raised in self-governing and Crown colonies (those without locally elected legislatures, as was the case with British Hong Kong) that are part of the British realm remain under British Government control. As the territorial governments are delegated responsibility only for internal government, the UK Government, as the government of the sovereign state, retains responsibility for national security and the defence of the fourteen remaining British Overseas Territories, of which six have locally raised regiments: Levels of Command.
The structure of the British Army beneath the level of divisions and brigades is also hierarchical, and command is based on rank. The table below details how many units within the British Army are structured, although there can be considerable variation between individual units: Whilst many units are organised as battalions or regiments administratively, the most common fighting unit is the combined arms unit known as a battlegroup. This is formed around a combat unit and supported by units (or sub-units) from other capabilities. An example of a battlegroup would be two companies of armoured infantry (e.g. from the 1st Battalion of the Mercian Regiment), one squadron of heavy armour (e.g. A Squadron of the Royal Tank Regiment), a company of engineers (e.g. B Company of the 22nd Engineer Regiment), a battery of artillery (e.g. D Battery of the 1st Regiment of the Royal Horse Artillery) and smaller attachments from medical, logistic and intelligence units. A battlegroup is typically organised and commanded by a battlegroup headquarters and named after the unit which provided the most combat units; in this example, it would be the 1 Mercian Battlegroup. This creates a self-sustaining mixed formation of armour, infantry, artillery, engineers and support units, commanded by a lieutenant colonel. Recruitment. The British Army primarily recruits from within the United Kingdom, but accepts applications from all British citizens. It also accepts applications from Irish citizens and Commonwealth citizens, with certain restrictions. Since 2018 the British Army has been an equal-opportunity employer (with some legal exceptions due to medical standards), and does not discriminate based on race, religion or sexual orientation. Applicants for the Regular Army must be a minimum age of 16, although soldiers under 18 may not serve in operations, and the maximum age is 36. Applicants for the Army Reserve must be a minimum of 17 years and 9 months, and a maximum of 43 years. Different age limits apply for officers and those in some specialist roles. Applicants must also meet several other requirements, notably regarding medical health, physical fitness, past criminal convictions, education, and any tattoos and piercings. Soldiers and officers in the Regular Army now enlist for an initial period of 12 years, with options to extend if they meet certain requirements. Soldiers and officers are normally required to serve for a minimum of 4 years from the date of enlistment and must give 12 months' notice before leaving; soldiers who joined before the age of 18 are normally required to serve for a minimum of 6 years. Oath of allegiance. All soldiers and commissioned officers must take an oath of allegiance upon joining the Army, a process known as attestation. Those who wish to swear by God use the following words: Others replace the words "swear by Almighty God" with "solemnly, sincerely and truly declare and affirm". Training. Candidates for the Army undergo common training, beginning with initial military training to bring all personnel to a similar standard in basic military skills, which is known as Phase 1 training. They then undertake further specialist trade training for their specific regiment or corps, known as Phase 2 training. After completing Phase 1 training a soldier is counted against the Army's trained strength, and upon completion of Phase 2 against the Army's fully trained trade strength.
Soldiers under the age of 17 and six months complete Phase 1 training at the Army Foundation College. Infantry soldiers complete combined Phase 1 and Phase 2 training at the Infantry Training Centre, Catterick, whilst all other soldiers attend Phase 1 training at the Army Training Centre Pirbright or the Army Training Regiment, Winchester, and then complete Phase 2 training at different locations depending on their specialism. Officers conduct their initial training at the Royal Military Academy Sandhurst (RMAS) before also completing their Phase 2 training at multiple different locations. Flags and ensigns. The British Army's official flag is the Union Jack. The Army also has a non-ceremonial flag that is often seen flying from military buildings and is used at recruiting and military events and exhibitions. Traditionally, most British Army units had a set of flags, known as the colours: normally a Regimental Colour and a King's Colour (the Union Jack). Historically these were carried into battle as a rallying point for the soldiers and were closely guarded. In modern units the colours are often prominently displayed, decorated with battle honours, and act as a focal point for regimental pride. A soldier re-joining a regiment (upon recall from the reserve) is described as being "re-called to the Colours". Ranks and insignia. Most ranks across the British Army are known by the same name regardless of the regiment they are in. However, the Household Cavalry call many ranks by different names, the Royal Artillery refer to corporals as bombardiers, the Rifles spell sergeant as serjeant, and private soldiers are known by a wide variety of titles, notably trooper, gunner, guardsman, kingsman, sapper, signaller, fusilier, craftsman and rifleman, dependent on the regiment they belong to. These names do not affect a soldier's pay or role. Reserve forces. The oldest of the Reserve Forces was the Militia Force (also referred to as the "Constitutional Force"), which (in the Kingdom of England, prior to 1707) was originally the main military defensive force (until the 1645 creation of the New Model Army there were otherwise only Royal bodyguards, including the Yeomen Warders and the Yeomen of the Guard, with armies raised only temporarily for expeditions overseas), made up of civilians embodied for annual training or emergencies, and which had used various schemes of compulsory service during different periods of its long existence. From the 1850s it recruited volunteers who engaged for terms of service. The Militia was originally an all-infantry force, though Militia coastal artillery, field artillery and engineer units were introduced from the 1850s. Volunteer Force units were also frequently raised during wartime and disbanded upon peace. The Volunteer Force was re-established as a permanent (i.e., in war and peace) part of the Reserve Forces in 1859. It differed from the Militia in a number of ways, most particularly in that volunteers did not commit to a term of service and were able to resign with fourteen days' notice (except while embodied). As volunteer soldiers were originally expected to fund the cost of their own equipment, few tended to come from the labouring class, among whom the Militia primarily recruited. The Yeomanry Force was made up of mounted units, organised similarly to the Volunteer Force, first raised during the two decades of war with France that followed the French Revolution.
As with the Volunteers, members of the Yeomanry were expected to foot much of the cost of their own equipment, including their horses, and the make-up of the units tended to be from more affluent classes. Although Militia regiments were linked with British Army regiments during the course of the Napoleonic Wars to feed volunteers for service abroad into the regular army, and volunteers from the Reserve Forces served abroad either individually or in contingents, service companies or battalions in a succession of conflicts from the Crimean War to the Second Boer War, personnel did not normally move between forces unless re-attested as a member of the new force, and units did not normally move from the Reserve Forces to become part of the Regular Forces, or vice versa. There were exceptions, however, as with the "New Brunswick Regiment of Fencible Infantry", raised in 1803, which became the 104th (New Brunswick) Regiment of Foot when it was transferred to the British Army on 13 September 1810. Another type of reserve force was created during the period between the French Revolution and the end of the Napoleonic Wars. Called Fencibles, these were disbanded after the Napoleonic Wars and not raised again, although the Royal Malta Fencible Regiment, later the "Royal Malta Fencible Artillery", existed from 1815 until the 1880s, when it became the Royal Malta Artillery, and the Royal New Zealand Fencible Corps was formed in 1846. The Reserve Forces were raised locally (in Britain, under the control of the Lords-Lieutenant of counties, and, in British colonies, under the colonial governors), and members originally were obliged to serve only within their locality (which, in the United Kingdom, originally meant within the county or other recruitment area, but was extended to anywhere in Britain, though not overseas). They have consequently also been referred to as "Local Forces". As they were (and in some cases are) considered separate forces from the British Army, though still within the British military, they have also been known as "Auxiliary Forces". The Militia and Volunteer units of a colony were generally considered to be separate forces from the "Home" Militia Force and Volunteer Force in the United Kingdom, and from the Militia Forces and Volunteer Forces of other colonies. Where a colony had more than one Militia or Volunteer unit, they would be grouped as a Militia or Volunteer Force for that colony, such as the Jamaica Volunteer Defence Force. Officers of the Reserve Forces could not sit on courts martial of regular forces personnel. The Mutiny Act did not apply to members of the Reserve Forces. The "Reserve Forces" within the British Isles were increasingly integrated with the British Army through a succession of reforms (beginning with the Cardwell Reforms) of the British military forces over the last two decades of the nineteenth century and the early years of the twentieth century, whereby the Reserve Forces units mostly lost their own identities and became numbered Militia or Volunteer battalions of regular British Army corps or regiments.
In 1908, the Yeomanry and Volunteer Force were merged to create the Territorial Force (renamed the "Territorial Army" after the First World War), with terms of service similar to the army and Militia, and the Militia was renamed the "Special Reserve". After the First World War the Special Reserve was renamed the Militia again, but was permanently suspended (although a handful of Militia units survived in the United Kingdom, its colonies and the Crown Dependencies). Although the Territorial Force was nominally still a separate force from the British Army, by the end of the century, at the latest, any unit wholly or partly funded from Army Funds was considered part of the British Army. Outside the United Kingdom proper, this was generally only the case for those units in the Channel Islands or the Imperial fortress colonies (Nova Scotia, before Canadian confederation; Bermuda; Gibraltar; and Malta). The Bermuda Militia Artillery, Bermuda Militia Infantry, Bermuda Volunteer Engineers and the Bermuda Volunteer Rifle Corps, for example, were paid for by the War Office and considered part of the British Army, with their officers appearing as such in the "Army List", unlike those of many other colonial units deemed auxiliaries. Today, the British Army is the only home British military force, having absorbed the various other forces over time, although British military units organised on Territorial Army lines remain in British Overseas Territories. These are still not formally considered part of the British Army, with only the Royal Gibraltar Regiment and the Royal Bermuda Regiment (an amalgam of the old Bermuda Militia Artillery and Bermuda Volunteer Rifle Corps) appearing on the British Army order of precedence and in the Army List, as well as on the Corps Warrant (the official list of those British military forces that are considered corps of the British Army). In October 2012 the Ministry of Defence announced that the Territorial Army was to be renamed the Army Reserve. Uniforms. The British Army uniform has sixteen categories, ranging from ceremonial uniforms to combat dress to evening wear. No. 8 Dress, the day-to-day uniform, is known as "Personal Clothing System – Combat Uniform" (PCS-CU) and consists of a Multi-Terrain Pattern (MTP) windproof smock, a lightweight jacket and trousers, with ancillary items such as thermals and waterproofs. The army has introduced tactical recognition flashes (TRFs); worn on the right arm of a combat uniform, the insignia denotes the wearer's regiment or corps. In addition to working dress, the army has a number of parade uniforms for ceremonial and non-ceremonial occasions. The most commonly seen uniforms are No. 1 Dress (full ceremonial, seen at formal occasions such as the changing of the guard at Buckingham Palace) and No. 2 Dress (Service Dress), a brown khaki uniform worn for non-ceremonial parades. Working headdress is typically a beret, whose colour indicates its wearer's type of regiment. Beret colours are:
4888
41142078
https://en.wikipedia.org/wiki?curid=4888
Bruin
Bruin (from Dutch for "brown") is an English folk term for the brown bear. Bruin, Bruins or BRUIN may also refer to:
4890
7903804
https://en.wikipedia.org/wiki?curid=4890
Bayesian probability
Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown. In the Bayesian view, a probability is assigned to a hypothesis, whereas under frequentist inference, a hypothesis is typically tested without being assigned a probability. Bayesian probability belongs to the category of evidential probabilities; to evaluate the probability of a hypothesis, the Bayesian probabilist specifies a prior probability. This, in turn, is then updated to a posterior probability in the light of new, relevant data (evidence). The Bayesian interpretation provides a standard set of procedures and formulae to perform this calculation. The term "Bayesian" derives from the 18th-century mathematician and theologian Thomas Bayes, who provided the first mathematical treatment of a non-trivial problem of statistical data analysis using what is now known as Bayesian inference. Mathematician Pierre-Simon Laplace pioneered and popularized what is now called Bayesian probability. Bayesian methodology. Bayesian methods are characterized by concepts and procedures as follows: Objective and subjective Bayesian probabilities. Broadly speaking, there are two interpretations of Bayesian probability. For objectivists, who interpret probability as an extension of logic, "probability" quantifies the reasonable expectation that everyone (even a "robot") who shares the same knowledge should share in accordance with the rules of Bayesian statistics, which can be justified by Cox's theorem. For subjectivists, "probability" corresponds to a personal belief. Rationality and coherence allow for substantial variation within the constraints they pose; the constraints are justified by the Dutch book argument or by decision theory and de Finetti's theorem. The objective and subjective variants of Bayesian probability differ mainly in their interpretation and construction of the prior probability. History. The term "Bayesian" derives from Thomas Bayes (1702–1761), who proved a special case of what is now called Bayes' theorem in a paper titled "An Essay Towards Solving a Problem in the Doctrine of Chances". In that special case, the prior and posterior distributions were beta distributions and the data came from Bernoulli trials. It was Pierre-Simon Laplace (1749–1827) who introduced a general version of the theorem and used it to approach problems in celestial mechanics, medical statistics, reliability, and jurisprudence. Early Bayesian inference, which used uniform priors following Laplace's principle of insufficient reason, was called "inverse probability" (because it infers backwards from observations to parameters, or from effects to causes). After the 1920s, "inverse probability" was largely supplanted by a collection of methods that came to be called frequentist statistics. In the 20th century, the ideas of Laplace developed in two directions, giving rise to "objective" and "subjective" currents in Bayesian practice. Harold Jeffreys' "Theory of Probability" (first published in 1939) played an important role in the revival of the Bayesian view of probability, followed by works by Abraham Wald (1950) and Leonard J. Savage (1954).
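To make the prior-to-posterior updating described above concrete, here is a minimal sketch in Python of the beta-Bernoulli special case that Bayes himself treated. The uniform Beta(1, 1) prior and the observed counts are illustrative assumptions, not figures from any source:

```python
# Minimal sketch of Bayesian prior-to-posterior updating in the
# beta-Bernoulli case: a Beta(a, b) prior over an unknown success
# probability p is conjugate to Bernoulli data, so the posterior is
# again a Beta distribution with the observed counts added in.

def update_beta(a: float, b: float, successes: int, failures: int):
    """Return the Beta posterior parameters after Bernoulli trials."""
    return a + successes, b + failures

# Illustrative choice: the uniform Beta(1, 1) prior, i.e. Laplace's
# "principle of insufficient reason" mentioned in the article.
a, b = 1.0, 1.0

# Hypothetical evidence: 7 successes and 3 failures in ten trials.
a, b = update_beta(a, b, successes=7, failures=3)

posterior_mean = a / (a + b)  # Beta(8, 4) has mean 8/12
print(f"Posterior: Beta({a:g}, {b:g}), mean = {posterior_mean:.3f}")
```

Because the beta family is conjugate to Bernoulli data, the update reduces to adding the success and failure counts to the prior's parameters; with richer models the same prior-times-likelihood logic applies but usually requires numerical methods.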
The adjective "Bayesian" itself dates to the 1950s; the derived "Bayesianism", "neo-Bayesianism" is of 1960s coinage. In the objectivist stream, the statistical analysis depends on only the model assumed and the data analysed. No subjective decisions need to be involved. In contrast, "subjectivist" statisticians deny the possibility of fully objective analysis for the general case. In the 1980s, there was a dramatic growth in research and applications of Bayesian methods, mostly attributed to the discovery of Markov chain Monte Carlo methods and the consequent removal of many of the computational problems, and to an increasing interest in nonstandard, complex applications. While frequentist statistics remains strong (as demonstrated by the fact that much of undergraduate teaching is based on it ), Bayesian methods are widely accepted and used, e.g., in the field of machine learning. Justification. The use of Bayesian probabilities as the basis of Bayesian inference has been supported by several arguments, such as Cox axioms, the Dutch book argument, arguments based on decision theory and de Finetti's theorem. Axiomatic approach. Richard T. Cox showed that Bayesian updating follows from several axioms, including two functional equations and a hypothesis of differentiability. The assumption of differentiability or even continuity is controversial; Halpern found a counterexample based on his observation that the Boolean algebra of statements may be finite. Other axiomatizations have been suggested by various authors with the purpose of making the theory more rigorous. Dutch book approach. Bruno de Finetti proposed the Dutch book argument based on betting. A clever bookmaker makes a Dutch book by setting the odds and bets to ensure that the bookmaker profits—at the expense of the gamblers—regardless of the outcome of the event (a horse race, for example) on which the gamblers bet. It is associated with probabilities implied by the odds not being coherent. However, Ian Hacking noted that traditional Dutch book arguments did not specify Bayesian updating: they left open the possibility that non-Bayesian updating rules could avoid Dutch books. For example, Hacking writes "And neither the Dutch book argument, nor any other in the personalist arsenal of proofs of the probability axioms, entails the dynamic assumption. Not one entails Bayesianism. So the personalist requires the dynamic assumption to be Bayesian. It is true that in consistency a personalist could abandon the Bayesian model of learning from experience. Salt could lose its savour." In fact, there are non-Bayesian updating rules that also avoid Dutch books (as discussed in the literature on "probability kinematics" following the publication of Richard C. Jeffrey's rule, which is itself regarded as Bayesian). The additional hypotheses sufficient to (uniquely) specify Bayesian updating are substantial and not universally seen as satisfactory. Decision theory approach. A decision-theoretic justification of the use of Bayesian inference (and hence of Bayesian probabilities) was given by Abraham Wald, who proved that every admissible statistical procedure is either a Bayesian procedure or a limit of Bayesian procedures. Conversely, every Bayesian procedure is admissible. Personal probabilities and objective methods for constructing priors. Following the work on expected utility theory of Ramsey and von Neumann, decision-theorists have accounted for rational behavior using a probability distribution for the agent. 
Johann Pfanzagl completed the "Theory of Games and Economic Behavior" by providing an axiomatization of subjective probability and utility, a task left uncompleted by von Neumann and Oskar Morgenstern: their original theory supposed that all the agents had the same probability distribution, as a convenience. Pfanzagl's axiomatization was endorsed by Oskar Morgenstern: "Von Neumann and I have anticipated ... [the question whether probabilities] might, perhaps more typically, be subjective and have stated specifically that in the latter case axioms could be found from which could derive the desired numerical utility together with a number for the probabilities (cf. p. 19 of The Theory of Games and Economic Behavior). We did not carry this out; it was demonstrated by Pfanzagl ... with all the necessary rigor". Ramsey and Savage noted that the individual agent's probability distribution could be objectively studied in experiments. Procedures for testing hypotheses about probabilities (using finite samples) are due to Ramsey (1931) and de Finetti (1931, 1937, 1964, 1970). Both Bruno de Finetti and Frank P. Ramsey acknowledge their debts to pragmatic philosophy, particularly (for Ramsey) to Charles S. Peirce. The "Ramsey test" for evaluating probability distributions is implementable in theory, and has kept experimental psychologists occupied for a half century. This work demonstrates that Bayesian-probability propositions can be falsified, and so meet an empirical criterion of Charles S. Peirce, whose work inspired Ramsey. (This falsifiability criterion was popularized by Karl Popper.) Modern work on the experimental evaluation of personal probabilities uses the randomization, blinding, and Boolean-decision procedures of the Peirce-Jastrow experiment. Since individuals act according to different probability judgments, these agents' probabilities are "personal" (but amenable to objective study). Personal probabilities are problematic for science and for some applications where decision-makers lack the knowledge or time to specify an informed probability distribution (on which they are prepared to act). To meet the needs of science and of human limitations, Bayesian statisticians have developed "objective" methods for specifying prior probabilities. Indeed, some Bayesians have argued the prior state of knowledge defines "the" (unique) prior probability distribution for "regular" statistical problems; cf. well-posed problems. Finding the right method for constructing such "objective" priors (for appropriate classes of regular problems) has been the quest of statistical theorists from Laplace to John Maynard Keynes, Harold Jeffreys, and Edwin Thompson Jaynes. These theorists and their successors have suggested several methods for constructing "objective" priors, among them the principle of maximum entropy, transformation-group analysis, and reference analysis, although it is not always clear how to assess the relative "objectivity" of the priors they produce. Each of these methods contributes useful priors for "regular" one-parameter problems, and each prior can handle some challenging statistical models (with "irregularity" or several parameters). Each of these methods has been useful in Bayesian practice. Indeed, methods for constructing "objective" (alternatively, "default" or "ignorance") priors have been developed by avowed subjective (or "personal") Bayesians like James Berger (Duke University) and José-Miguel Bernardo (Universitat de València), simply because such priors are needed for Bayesian practice, particularly in science.
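As a concrete instance of the constructions listed above, the Jeffreys prior for a Bernoulli success probability theta is proportional to the square root of the Fisher information, 1/(theta(1 - theta)), which works out to a Beta(1/2, 1/2) distribution. The sketch below is a minimal plain-Python illustration, with names of our own choosing, not a reference implementation.

```python
import math

def jeffreys_unnormalized(theta: float) -> float:
    """Unnormalized Jeffreys prior for Bernoulli data: the square root of
    the Fisher information, theta**-0.5 * (1 - theta)**-0.5."""
    return 1.0 / math.sqrt(theta * (1.0 - theta))

# The normalizing constant is pi (this is the Beta(1/2, 1/2) density), so
# the normalized density at theta = 0.5 is 2/pi, roughly 0.637.
print(jeffreys_unnormalized(0.5) / math.pi)
```

Part of its appeal as an "objective" choice is invariance: reparametrizing theta yields the same prior, which is the transformation-group idea mentioned above.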
The quest for "the universal method for constructing priors" continues to attract statistical theorists. Thus, the Bayesian statistician needs either to use informed priors (using relevant expertise or previous data) or to choose among the competing methods for constructing "objective" priors.
Bert Bell
De Benneville "Bert" Bell (February 25, 1895 – October 11, 1959) was an American professional football executive and coach. He was the fifth chief executive and second commissioner of the National Football League (NFL) from 1946 until his death in 1959. As commissioner, he introduced competitive parity into the NFL to improve the league's commercial viability and promote its popularity. Whereas Bell had become the chief executive in a sport that was largely seen as second-rate and heading a league still plagued by franchise instability, by his death the NFL was a financially sound sports enterprise and seriously challenging Major League Baseball for preeminence among sports attractions in the United States. Bell was posthumously inducted into the charter class of the Pro Football Hall of Fame. Bell played football at the University of Pennsylvania, where as quarterback, he led his team to an appearance in the 1917 Rose Bowl. After being drafted into the US Army during World War I, he returned to complete his collegiate career at Penn and went on to become an assistant football coach with the Quakers in the 1920s. During the Great Depression, he was an assistant coach for the Temple Owls and a co-founder and co-owner of the Philadelphia Eagles. With the Eagles, Bell led the way in cooperating with the other NFL owners to establish the NFL draft in order to afford the weakest teams the first opportunity to sign the best available players. He subsequently became sole proprietor of the Eagles, but the franchise suffered financially. Eventually, he sold the team and bought a share in the Pittsburgh Steelers. During World War II, Bell argued against the league suspending operations until the war's conclusion. After the war, he was elected NFL commissioner and sold his ownership in the Steelers. As commissioner, he implemented a proactive anti-gambling policy, negotiated a merger with the All-America Football Conference (AAFC), and unilaterally crafted the entire league schedule with an emphasis on enhancing the dramatic effect of late-season matches. During the Golden Age of Television, he tailored the game's rules to strengthen its appeal to mass media and enforced a policy of blacking out local broadcasts of home contests to safeguard ticket receipts. Amid criticism from franchise owners and under pressure from Congress, he unilaterally recognized the NFLPA and facilitated in the development of the first pension plan for the players. He survived to oversee the "Greatest Game Ever Played" and to envision what the league would become in the future. As commissioner, Bell oversaw the integration of the NFL. Although Fritz Pollard was the first African American to play in the NFL, appearing with three teams from 1922 to 1926, a “gentleman’s agreement” among the owners kept the sport segregated for another 20 years. In 1946, four black players began playing in the NFL. Early life (1895–1932). Bell was born De Benneville Bell, on February 25, 1895, in Philadelphia to John C. Bell and Fleurette de Benneville Myers. His father was an attorney who served a term as the Pennsylvania Attorney General. His older brother, John C. Jr., was born in 1892. Bert's parents were very wealthy, and his mother's lineage predated the American Revolutionary War. His father, a Quaker of the University of Pennsylvania (class of 1884) during the early days of American football, accompanied him to his first football game when Bell was six years old. 
Thereafter, Bell regularly engaged in football games with childhood friends. Bell attended the Episcopal Academy from 1904, the Delancey School from 1909 to 1911, and then the Haverford School until 1914. About this time, his father was installed as athletics director at Penn and helped form the National Collegiate Athletic Association (NCAA). At Haverford, Bell captained the school's football, basketball, and baseball teams, and "was awarded The Yale Cup [for being] 'The pupil who has done the most to promote athletics in the school.'" Although he excelled at baseball, his devotion was to football. His father, who was named a trustee at Penn in 1911, said of Bell's plans for college, "Bert will go to Penn or he will go to hell." University of Pennsylvania (1914–1919). Bell entered the University of Pennsylvania in Philadelphia in the fall of 1914. He majored in English and joined the Phi Kappa Sigma fraternity. In a rare accomplishment for a sophomore, he was named the starting quarterback by Penn coach George H. Brooke. He also was a defender, punter, and punt returner. After the team's 3–0 start, Bell temporarily shared his quarterbacking duties until he reclaimed them later in the season, as Penn finished with a record of 3–5–2. Prior to Penn's 1916 season, Bell's mother died while he was en route to her bedside. He started the first game for the Quakers under new coach Bob Folwell, but mixed results left him platooned for the rest of the season. Penn finished with a record of 7–2–1. However, the Quakers secured an invitation to the 1917 Rose Bowl against the Oregon Ducks. Bell had the best offensive gain for Penn during their 0–14 loss to Oregon, a 20-yard run, but was replaced late in the game at quarterback after throwing an interception. In the 1917 season, Bell led Penn to a 9–2 record. Following the 1917 season, Bell registered with a Mobile Hospital Unit of the United States Army for World War I and was deployed to France in May 1918. For its participation in hazardous duty, his unit received a congratulatory letter for bravery from General John J. Pershing, and Bell was promoted to first sergeant. After the war, Bell returned to the United States in March 1919. He returned to Penn as captain of the team in the fall and again performed erratically. The Quakers finished 1919 with a 6–2–1 record. Academically, his aversion to attending classes forced him to withdraw from Penn without a degree in early 1920. His collegiate days ended with his having been a borderline All-American, but this period of his life had proven that he "possessed the qualities of a leader." Early career (1920–1932). Bell assembled the Stanley Professionals in Chicago in 1920, but he disbanded the team before it played any games because of the negative publicity Chicago had received from the Black Sox Scandal. He joined John Heisman's staff at Penn as an assistant coach in 1920, where he remained for several years. At Penn, he was well regarded as a football coach, and after its 1924 season, he drew offers for, but declined, head-coaching assignments at other universities. At least as early as 1926, his avocation was socializing and frequenting Saratoga Race Course, where he counted as friends Tim Mara, Art Rooney, and George Preston Marshall. In 1928, Bell tendered his resignation at Penn in protest over the emphasis on in-season scrimmages during practices by Lud Wray, a fellow assistant coach. Bell's resignation was accepted prior to the start of the 1929 season.
Bell was then an employee of the Ritz-Carlton in Philadelphia. At one point, he tried his hand as a stock broker and lost $50,000 during the Wall Street Crash of 1929. His father bailed him out, and he returned to working at the Ritz. From 1930 until 1932, he was a backfield coach for Temple in Philadelphia. In 1932, Marshall tried to coax Bell into buying the rights to an NFL franchise, but Bell disparaged the league and ridiculed the idea. When Pop Warner was hired to coach Temple for the 1933 season, Warner chose to hire his own assistants, and Bell was let go by Temple. NFL career. Philadelphia Eagles (1933–1940). By early 1933, Bell's opinion of the NFL had changed, and he wanted to become an owner of a team based in Philadelphia. After being advised by the NFL that a franchise in Philadelphia would be contingent on a relaxation of the Pennsylvania blue laws, he was the driving force in lobbying to get the laws relaxed. He borrowed funds from Frances Upton, partnered with Wray, and procured the rights to a franchise in Philadelphia, purchasing the Frankford Yellow Jackets, which he rechristened the Philadelphia Eagles, a name inspired by Franklin Delano Roosevelt's use of the American eagle symbol. After the inaugural 1933 Philadelphia Eagles season, Bell married Upton at St. Madeleine Sophie Roman Catholic Church in Philadelphia. Days later, his suggestion that the winner of the NFL championship game be awarded the Ed Thorp Memorial Trophy was adopted. In 1934, the Eagles finished with a 4–7 record. Their inability to seriously challenge other teams made it difficult to sell tickets, and his failure to sign a talented college prospect led him to conclude that the only way to bring stability to the league was to institute a draft to ensure the weakest teams had an advantage in signing the preeminent players. In 1935, his proposal for a draft was accepted, and in February 1936, the first draft was held, at which he acted as Master of Ceremonies. Later that month, his first child, Bert Jr., was born. In the Eagles' first three years, the partners lost $85,000, and at a public auction, Bell became sole owner of the Eagles with a bid of $4,500. Austerity measures forced him to replace Wray as head coach of the Eagles himself, and Bell led the Eagles to a 1–11 finish, their worst record ever. In December, an application for a franchise in Los Angeles was blocked by Bell and Pittsburgh Steelers owner Rooney, who deemed it too great a distance to travel for games. During the Eagles' 2–8–1 1937 season, his second child, John "Upton", was born. In the Eagles' first profitable season, 1938, they posted a 5–6 record. The Eagles finished 1–9–1 in 1939 and 1–10 in 1940. Pittsburgh Steelers (1940–1945). In December 1940, Bell brokered the sale of Rooney's Steelers to Alexis Thompson, and then Rooney acquired half of Bell's interest in the Eagles. In a series of events known as the "Pennsylvania Polka", Rooney and Bell exchanged their entire Eagles roster and their "territorial rights" in Philadelphia to Thompson for his entire Steelers roster and his rights in Pittsburgh. Ostensibly, Rooney had provided assistance to Bell by rewarding him with a 20% commission on the sale of the Steelers. Bell became the Steelers head coach and Rooney became the general manager.
During the training camp of Pittsburgh's inaugural season under the nickname Steelers, Bell was buoyant with optimism about the team's prospects, but he became crestfallen after Rooney denigrated the squad and flippantly remarked that they looked like the "[s]ame old Steelers" (SOS). After losing the first two games of the 1941 season, Rooney compelled Bell to resign as head coach. Bell's coaching career ended with a 10–46–2 record; his 0.179 winning percentage is the second-lowest in NFL history, ahead of only Phil Handler's 0.105, among coaches with at least five seasons. At 36 games under .500, he held the record for futility until John McKay passed him in 1983 and Marion Campbell passed him in 1988. His first daughter and last child, Jane Upton, was born several months after the season's conclusion. By 1943, 40% of the NFL rosters had been drafted into the United States Armed Forces for World War II. The resulting difficulty in fielding a full-strength squad led some owners to recommend the league shut down until the war ended. Bell argued against this: he feared the league might not be able to resume operations easily after the war, and he reasoned that since Major League Baseball was continuing unabated, the NFL should do likewise. Throughout Bell's affiliation with the Steelers, he suffered monetarily, and Rooney bought an increasing allotment of the franchise from him. Compounding Bell's problems, Arch Ward organized the All-America Football Conference (AAFC) in 1944 to displace the NFL's sovereignty in professional football. Ward's AAFC promptly began luring players to join the league, which resulted in salaries being driven up drastically. In Bill Dudley's contract proceedings with the Steelers, Dudley attributed Bell's anxiety during the negotiations to the rivalry from the AAFC. Furthermore, by the end of 1945, the Steelers were in the most economically perilous situation in their history. NFL commissioner (1946–1959). Election, Hapes-Filchock, and the NFL schedule (1946–1948). Elmer Layden was appointed the first NFL commissioner in 1941, but Ward appeared to have dictated his hiring. Layden tendered his resignation for personal reasons in January 1946. Bell, who was not well respected in Pittsburgh, was elected to replace him. He received a three-year contract at $20,000 per year and sold his stake in the Steelers to Rooney, albeit at a price Bell did not consider full value. He was then immediately placed at the center of a controversy wherein the owners denied Dan Reeves permission to relocate the Cleveland Rams to Los Angeles. Bell moderated a settlement, and, as a result, the Los Angeles Rams were formed. As a precondition to the Rams leasing the Los Angeles Coliseum, they signed Kenny Washington, which marked the beginning of the end of racial segregation on the field, but also caused "all hell to break loose" amidst the owners. The drawing up of a regular-season schedule had been a perennial source of contention among the NFL owners since the league's inception. The crux of the problem was that scheduling games meant weighing the interests of owners who, early in the season, wanted their franchises to confront teams that drew the largest crowds, against those of owners who wanted to play the weaker franchises to pad their team's win–loss record. The resultant impasse led the owners, in 1946, to confer upon Bell the sole discretion in developing the league's schedule.
He used this authority to pit, early in the season, the weaker teams against other weak teams, and the strong teams against other strong teams. His goal was to augment game attendances by keeping the difference in team standings to a minimum as deep into the season as possible. On the eve of the 1946 championship game, Bell was notified that Merle Hapes and Frank Filchock of the New York Giants had been implicated in a bribery scandal. Bell permitted Filchock to play in the game but suspended Hapes. At the next NFL owners' meeting, Bell was worried the repercussions from this event would lead to his firing. However, he was pleasantly surprised to learn that his contract would be extended to five years at $30,000 per year. Reinvigorated with renewed support, he persuaded the owners to allow him to put sudden-death overtime into the playoffs. Subsequently, he wrote an anti-gambling resolution into the league constitution, which empowered him to permanently ban any NFL-associated personnel for betting on a game or for withholding information about a game being possibly fixed. Furthermore, to keep gamblers from getting inside information, he kept secret the names of the officials he would assign to games, and he directed each team to issue a pre-game injury report listing anyone who might not participate in a game. Eventually, he lobbied to get every state in the US to criminalize the fixing of sporting events and put employees on the payroll of the NFL to investigate potential betting scams. AAFC–NFL merger (1948–1950). The NFL's struggle against the AAFC generated stress on wages, attendance, and marketing, and by 1949, it had prevented the NFL from showing a profit for three consecutive years. Bell and representatives from both leagues met to attempt a merger, but their efforts were fruitless. In an unrelated matter, he apprised the owners that attendance records had shown that televising games locally had a negative impact on the sale of home tickets. Nevertheless, he concluded the NFL's first television contract—the 1949 championship game. Simultaneously, he dealt with a lawsuit from Bill Radovich, who had been blacklisted for leaving the Lions and gaining employment with the AAFC. Bell and the owners were advised by John C. Jr. that this lawsuit was potentially not winnable, and the ramifications from the outcome of the case weighed heavily on Bell. One of the primary impediments to an AAFC–NFL merger was the supposed violation of "territorial rights" claimed by Marshall. Eventually, Bell gathered enough support to effect a compromise with the AAFC. In late 1949, the leagues merged, as three AAFC teams (the Cleveland Browns, San Francisco 49ers, and Baltimore Colts) joined the NFL; a fourth AAFC team (the Los Angeles Dons) merged with the Los Angeles Rams, and the other AAFC teams disbanded. Bell stayed on as commissioner with his contract extended from five to ten years. Seeking to capitalize on the publicity of the residual AAFC–NFL rivalry, he drew on his dramatic and business sense and allocated the 1950 opening game to a contest between the 1949 champion Eagles and the perennial AAFC champion Browns. Feeling financially secure after the merger, he purchased his first home for himself and his family in Narberth, Pennsylvania. Marketing of the NFL (1950–1956). In 1950, Bell introduced a blackout rule to the NFL which forbade all teams to televise their home games within a 75-mile radius of their stadium – except for the Rams.
Consequently, the United States Department of Justice (DOJ) opened an investigation into a violation of the Sherman Antitrust Act. Rams attendance for 1950 dropped off by 50%, and this signaled a potential financial disaster. In 1951, he licensed the DuMont Television Network to air the championship games for the next five years, and he stipulated that teams were free to develop their own television contracts independently. However, preceding the 1951 season, he reimposed the blackout rule on all teams in the league. The DOJ filed suit over this, and Bell publicly retorted, "You can't give fans a game for free on TV and also expect them to go to the ballpark"; nevertheless, the suit was ordered to trial for January 1952. After the 1951 season ended, he gained unilateral control over the setting of a television strategy for the NFL. He negotiated a deal with DuMont, which granted it the rights to nationally broadcast one regular-season game every week, and he directed that the income from this contract was to be shared equally between all the teams. In the DOJ's case, the judge ruled that the blackout policy was legal, although both Bell and the franchises collectively were enjoined from negotiating a league-wide TV contract; Bell was ecstatic. Later that year, Bell forced one of the owners of the Cleveland Browns to sell all of his shares in the team after Bell determined the owner had bet on Browns games. Although he hated to fly, at some indeterminate point, he visited the training camps of every team and lectured on the danger gamblers posed to the league. Bell authorized a Pro Bowl to be held at the end of each season in order to showcase the talents of the best players. But in the early 1950s, on-field play sometimes degenerated into borderline "assault and battery", with teams' star players being viciously targeted by opposing players. He answered charges that the league was too savage by saying, "I have never seen a maliciously dirty football player in my life and I don't believe there are any." Nevertheless, he ordered broadcasts to follow a strict rule of conduct whereby TV announcers would not be permitted to criticize the game, and neither fights nor injuries could be televised, by virtue of his belief that announcers were "salesmen for professional football [and] we do not want kids believing that engaging in fights is the way to play football." Bell was criticized for censoring TV broadcasts, a charge he dismissed as not pertinent because he believed he was not impeding the print media but only advertising a product. After CBS and NBC gained the rights to broadcast the games in 1956, he advised the franchises to avoid criticizing the games or the officials, and forewarned that TV would give "us our greatest opportunity to sell the NFL and everyone must present to the public the greatest games ... combined with the finest sportsmanship." This relationship with television was the beginning of the NFL's rise to becoming America's most popular sport. Compromise with the NFLPA (1956–1957). In Radovich v. National Football League, the Supreme Court ruled in Radovich's favor and declared the NFL was subject to antitrust laws, the implication being that the legality of the draft and the reserve clause was dubious. Bell pressed a case in the media that the NFL should be exempted from antitrust regulations, arguing the league was a sport and not a business. He invited an investigation from Congress with respect to the court's ruling.
The House Judiciary Committee, chaired by Emanuel Celler, who believed the draft was illegal and should be abolished, convened in July 1957 to discuss the ramifications of the Radovich decision. Red Grange and Bell testified at the committee's solicitation and argued the draft was essential to the sport's success. Representatives of the NFLPA contradicted these statements and said the draft and the reserve clause were anti-labor, and it seemed as if Congress was going to accept their position. Faced with Congressional opposition, Bell formally recognized the NFLPA and declared he would negotiate with its representatives. However, Bell was speaking only for himself and without the auspices of the owners. At the next owners' meeting, Rooney admonished that they either had to recognize the NFLPA or remove Bell as commissioner; removing him would have required a "super-majority" vote. Bell unsuccessfully attempted to persuade the owners to permit the NFLPA to act as a bargaining agent for the players. However, he did reach a compromise with the owners to get them to acquiesce to some of the NFLPA's requests for salary standards and health benefits. Final days (1958–1959). For the 1958 season, the duration of timeouts was extended from 60 to 90 seconds, and Bell mandated that officials call a few "TV timeouts" during each game, a change which triggered criticism from sportswriters. The 1958 championship game became the first NFL championship game decided in overtime, and it was considered to be the greatest football game ever played. The game further increased football's marketability to television advertising, and the drama associated with overtime was the catalyst. Years later, after witnessing Bell openly crying after the game, Raymond Berry attributed it to Bell's realization of the impact the game would have on the prevalence of the sport. The death of Mara in February 1959 unsettled Bell, and he experienced a heart attack later that month. He converted to Catholicism that summer because of the lifelong urging of his wife, Mara's death, and his enduring friendship with Rooney, a practicing Catholic. Bell was advised by his doctor to avoid going to football games, to which he quipped, "I'd rather die watching football than in my bed with my boots off." Bell and his children attended an Eagles game on October 11 at Franklin Field against the Steelers (both his old teams). The Eagles held complimentary box seats for him and guests to watch the game, but he preferred to buy his own tickets and sit with the other fans. Sitting toward the end of the field near the end zone during the fourth quarter of the game, he suffered a heart attack and died later that day at the nearby university hospital. League Treasurer Austin Gunsel was named interim NFL commissioner for the rest of the season. Afterwards, he was remembered as "a man of buoyant joviality, with a rough and ready wit, laughter and genuine humility and honesty, clearly innocent of pretense and [pretension]." His funeral was held at Narberth's St. Margaret Roman Catholic Church, and Monsignor Cornelius P. Brennan delivered the eulogy as close friends and admirers attended the mass. Dominic Olejniczak and all the extant owners of the NFL franchises were pallbearers. Bell was interred at Calvary Cemetery in West Conshohocken, Pennsylvania, northwest of Philadelphia. Bell had named Baltimore Colts owner Carroll Rosenbloom as his executor.
Bell had been Rosenbloom's backfield coach at Penn in the 1920s, and later had convinced Rosenbloom to purchase the Colts after becoming commissioner. Rosenbloom owned the Colts in 1958 when they won the greatest game ever played and brought greater national attention to the NFL. After Bell's death, Rosenbloom hired Bell's sons Upton and Bert Jr. to work for the Colts. Legacy and honors. Bell was inducted into the Pro Football Hall of Fame, the Penn Athletics Hall of Fame, the Philadelphia Sports Hall of Fame, and Haverford's Athletic Hall of Fame. The Maxwell Football Club, which he founded in 1937, has presented the best NFL player of the year with the Bert Bell Award since 1959. The Bert Bell Benefit Bowl, an exhibition game, was played in his honor from 1960 through 1969. A statue of Bell was scheduled to be unveiled on October 25, 2024, at the University of Pennsylvania's Franklin Field. The historical marker to Bell in Narberth, Pennsylvania, is within blocks of his family's home at 323 Haverford Avenue. Although he did not have the wherewithal to prevent the wholesale betting on games, he was proactive in ensuring games were not tampered with by gamblers, and he created the foundation of the contemporary NFL anti-gambling policy. Bell was criticized as too strict in his refusal to let sold-out games be televised locally. Nevertheless, his balancing of television broadcasts against protecting game attendance made the NFL the "healthiest professional sport in America", and he was the "leading protagonist in pro football's evolution into America's major sport." He had understood that the league needed a cooperative television contract with revenue-sharing, but he failed to overcome the obstacles to achieve it. He was portrayed by sportswriters as ensuring the owners treated the players fairly, and his decision to recognize the NFLPA in the face of adversity from owners was a "master stroke" in thwarting Congressional intervention. After he initiated terms for a pension plan with the players in 1959, little progress was made with the NFLPA; however, the first players' pension plan – the Bert Bell National Football League Retirement Plan – was approved in 1962. Bell's implementation of the draft did not show immediate results, but it was "the single greatest contributor to the [league]'s prosperity" in its first eighty-four years. His original version of the draft was later ruled unconstitutional, but his anchoring of the success of the league to competitive balance has been "hailed by contemporaries and sports historians". Bell's belief in the efficacy of parity for the financial health of the league is best summarized by his most famous quotation, made to the press in November 1952 to explain an uptick in stadium attendance: "The teams are so closely matched that on any given Sunday, any one team can beat any other team. Professional football has come down to the point where the psychological edge is the determining factor. Physically these teams are even, so it depends on the mental outlook of the squads to determine the winner. It's this factor that has made the game close and is bringing fans out to the ball parks."
Bob Costas
Robert Quinlan Costas (born March 22, 1952) is an American sportscaster who is known for his long tenure with NBC Sports, from 1980 through 2019. He has received 28 Emmy awards for his work and was the prime-time host of 12 Olympic Games from 1988 until 2016. He is currently employed by Warner Bros. Discovery, where he does commentary on CNN. He is also employed by MLB Network, where he makes special appearances and once hosted an interview show called "Studio 42 with Bob Costas". Early life and education. Costas is the son of a Greek father, John George Costas, and an Irish mother, Jayne Costas (née Quinlan). He grew up in Commack, New York, and attended Commack High School South. He attended the S. I. Newhouse School of Public Communications at Syracuse University, but dropped out in 1974. Costas got his first radio experience as a freshman at WAER, a student-run radio station. In the mid-1980s, he established the Robert Costas Scholarship at the Newhouse School, of which the first recipient was Mike Tirico in 1987. Broadcasting career. Early career. While studying communications in college, Costas began his professional career in 1973, at WSYR-TV (now WSTM-TV) and WSYR-FM radio in Syracuse. He called games for the minor league Syracuse Blazers of the Eastern Hockey League. After leaving school in 1974, he joined KMOX radio in St. Louis. He covered games of the American Basketball Association (ABA). Costas would call Missouri Tigers basketball and co-host KMOX's "Open Line" call-in program. He did play-by-play for Chicago Bulls broadcasts on WGN-TV during the 1979–80 NBA season. NBC Sports. In 1980, Costas was hired by NBC. Don Ohlmeyer, who at the time ran NBC's sports division, told 28-year-old Costas he looked like a 14-year-old. For many years, Costas hosted NBC's National Football League (NFL) coverage and National Basketball Association (NBA) coverage. He also did play-by-play for NBA and Major League Baseball (MLB) coverage. With the introduction of the NBC Sports Network, Costas also became the host of the new monthly interview program "Costas Tonight". Boxing. On March 30, 2015, it was announced that Costas would join forces with Marv Albert (blow-by-blow) and Al Michaels (host) on the April 11, 2015, edition of NBC's primetime "PBC on NBC" boxing series. Costas was added to serve as a special contributor for the event from Barclays Center in Brooklyn. He would narrate and write a feature on the storied history of boxing in New York City. Golf. Costas hosted NBC's coverage of the U.S. Open golf tournament from 2003 to 2014. Major League Baseball. For baseball telecasts, Costas teamed with Sal Bando (1982), Tony Kubek (from 1983 to 1989), and Joe Morgan and Bob Uecker (from 1994 to 2000). One of his most memorable broadcasts occurred on June 23, 1984 (in what would go down in baseball lore as "The Sandberg Game"). Costas, along with Tony Kubek, was calling the Saturday baseball "Game of the Week" from Chicago's Wrigley Field. The game between the Chicago Cubs and St. Louis Cardinals in particular was cited for putting Ryne Sandberg (as well as the 1984 Cubs in general, who would go on to make their first postseason appearance since 1945) "on the map". In the ninth inning, the Cubs, trailing 9–8, faced the premier relief pitcher of the time, Bruce Sutter. Sandberg, then not known for his power, slugged a home run to left field against the Cardinals' ace closer. Despite this dramatic act, the Cardinals scored two runs in the top of the tenth.
Sandberg came up again in the tenth inning, facing a determined Sutter with one man on base. Sandberg then shocked the national audience by hitting a second home run, even farther into the left field bleachers, to tie the game again. The Cubs went on to win in the 11th inning. When Sandberg hit that second home run, Costas said, "Do you believe it?!" The Cardinals' Willie McGee also hit for the cycle in the same game. While hosting Game 4 of the 1988 World Series between the Los Angeles Dodgers and Oakland Athletics on NBC, Costas angered many members of the Dodgers (especially the team's manager, Tommy Lasorda) by commenting before the start of the game that the Dodgers quite possibly were about to put up the weakest-hitting lineup in World Series history. That comment ironically fired up the Dodgers' competitive spirit, to the point where a chant of "Kill Costas!" began in the clubhouse, and the Dodgers eventually rolled to a 4–1 series victory. Besides calling the 1989 American League Championship Series for NBC, Costas also filled in for a suddenly ill Vin Scully, who had come down with laryngitis, for Game 2 of the 1989 National League Championship Series alongside Tom Seaver. Game 2 of the NLCS took place on Thursday, October 5, which was an off day for the ALCS. NBC then decided to fly Costas from Toronto to Chicago to substitute for Scully on Thursday night. Afterward, Costas flew back to Toronto, where he resumed work on the ALCS the next night. Costas anchored NBC's pre- and post-game shows for NFL broadcasts and the pre- and post-game shows for numerous World Series and Major League Baseball All-Star Games during the 1980s (the first being for the 1982 World Series). Costas did not get a shot at doing play-by-play (as the games on NBC were previously called by Vin Scully) for an All-Star Game until 1994 and a World Series until 1995 (when NBC split the coverage with ABC under "The Baseball Network" umbrella), when NBC regained Major League Baseball rights after a four-year hiatus (when the broadcast network television contract moved over to CBS, exclusively). It was not until 1997 that Costas finally got to do play-by-play for a World Series from start to finish. Costas ended up winning a Sports Emmy Award for Outstanding Sports Personality, Play-by-Play. In 1999, Costas teamed with his then-NBC colleague Joe Morgan to call two weekday night telecasts for ESPN. The first was on Wednesday, August 25, with the Detroit Tigers playing against the Seattle Mariners. On August 3, 2019, Costas, alongside Paul O'Neill and David Cone, called both games of a double-header between the New York Yankees and Boston Red Sox for the YES Network. Costas was filling in for Michael Kay, who was recovering from vocal cord surgery. On August 20, 2021, reports emerged that TBS was nearing an agreement with Costas to host their coverage of that year's NLCS. The reports were confirmed when TBS announced his role on October 7, 2021. On October 31, 2024, Costas announced that he was officially retiring from Major League Baseball play-by-play calling after 44 years. His final Major League Baseball broadcast as a play-by-play announcer was thus Game 4 of the 2024 American League Division Series between the New York Yankees and Kansas City Royals, airing on TBS. NASCAR. In November 2017, it was announced that Costas would co-anchor alongside Krista Voda on NBC's pre-race coverage leading into the NASCAR Cup Series finale from Homestead.
In addition to hosting pre-race coverage, Costas would conduct a live interview with incoming NBC broadcaster Dale Earnhardt Jr., who was running his final race. National Basketball Association. Costas served as NBC's lead play-by-play announcer for their National Basketball Association (NBA) broadcasts from 1997 to 2000. In that time frame, Costas called three NBA Finals, including the 1998 installment (which set an all-time ratings record for the NBA) between the Chicago Bulls and Utah Jazz. Costas was paired with Isiah Thomas and Doug Collins on NBC's NBA telecasts. Following the 2000 NBA Finals, he was replaced as lead play-by-play announcer by Marv Albert, who, incidentally, was the man Costas had directly replaced on the "NBA on NBC" in the first place. Costas had previously presided as host of NBC's pre-game show, "NBA Showtime", while also providing play-by-play as a fill-in when necessary. Costas later co-anchored (with Hannah Storm) NBC's NBA Finals coverage in 2002, which was their last to date (before the NBA's network television contract moved to ABC). Professional football. Costas began as a play-by-play announcer, working with analyst Bob Trumpy. In 1984, he would replace Len Berman as studio host. Among his NFL colleagues was O.J. Simpson, who had called 30 Rockefeller Plaza asking to speak to Costas during Simpson's infamous police chase through the freeways of Los Angeles. However, Costas was several blocks away at Madison Square Garden covering Game 5 of the 1994 NBA Finals. Costas learned of the attempted contact when visiting Simpson in prison later that year. Costas remained NFL studio host until 1992, when he was replaced by Jim Lampley. NBC Sports allowed Costas to opt out from having to cover the XFL. He publicly denigrated the league throughout its existence and remains a vocal critic of the XFL and its premise. In 2006, Costas returned to NFL studio hosting duties for NBC's new "Sunday Night Football", hosting its pre-game show "Football Night in America". Costas is nicknamed "Rapping Roberto" by New York City's "Daily News" sports media columnist Bob Raissman. Al Michaels also called him "Rapping Roberto" during the telecast between the Indianapolis Colts and the New York Giants on September 10, 2006, in response to Costas calling him "Alfalfa". Olympics (1988–2016). Costas has frontlined many Olympics broadcasts for NBC. They include Seoul in 1988, Barcelona in 1992, Atlanta in 1996, Sydney in 2000, Salt Lake City in 2002, Athens in 2004, Torino in 2006, Beijing in 2008, Vancouver in 2010, London in 2012, Sochi in 2014 and Rio in 2016. He discusses his work on the Olympic telecasts extensively in a book by Andrew Billings entitled "Olympic Media: Inside the Biggest Show on Television". A personal influence on Costas has been legendary ABC Sports broadcaster Jim McKay, who hosted many Olympics for ABC from the 1960s to the 1980s. During the 1992 Barcelona and 1996 Atlanta opening ceremonies, Costas's remarks on the Chinese teams' possible drug use caused an uproar among the Chinese American and international communities. Thousands of dollars were raised to purchase ads in "The Washington Post" and the Sunday edition of "The New York Times", featuring an image of the head of a statue of Apollo and reading: "Costas Poisoned Olympic Spirit, Public Protests NBC". However, Costas's comments were made subsequent to the suspension of Chinese coach Zhou Ming after seven of his swimmers were caught using steroids in 1994.
Further evidence of Chinese athletes' drug use came in 1997, when Australian authorities confiscated 13 vials of Somatropin, a human growth hormone, from the bag of Chinese swimmer Yuan Yuan upon her arrival for the 1997 World Swimming Championships. At the World Championships, four Chinese swimmers tested positive for the banned substance Triamterene, a diuretic used to dilute urine samples to mask the presence of anabolic steroids. Including these failed drug tests, 27 Chinese swimmers were caught using performance-enhancing drugs from 1990 through 1997, more than the rest of the world combined. Costas's commentary of the 2012 Summer Olympics opening ceremony, delivered alongside co-hosts Meredith Vieira and Matt Lauer, came under fierce criticism, with Costas being described as making "a series of jingoistic remarks, including a joke about dictator Idi Amin when Uganda's team appeared" and the combined commentary as being "ignorant" and "banal". Following the Olympics, Costas appeared on Conan O'Brien's talk show and jokingly criticized his employer for its decision to air a preview of the upcoming series "Animal Practice" over a performance by The Who during the London closing ceremony. "So here is the balance NBC has to consider: The Who, 'Animal Practice'. Roger Daltrey, Pete Townshend—monkey in a lab coat. I'm sure you'd be the first to attest, Conan, that when it comes to the tough calls, NBC usually gets 'em right," Costas said, alluding at the end to O'Brien's involvement in the 2010 "Tonight Show" conflict. An eye infection Costas had at the start of the 2014 Winter Olympics forced him, on February 11, 2014, to cede his Olympic hosting duties to Matt Lauer (four nights) and Meredith Vieira (two nights), the first Olympics prime-time coverage Costas had not hosted since the 1998 Winter Games (for which NBC did not hold the rights). Thoroughbred racing. From 2001 until 2018, Costas co-hosted the Kentucky Derby. In 2009, he hosted Bravo's coverage of the 2009 Kentucky Oaks. After Costas officially departed from NBC Sports, his role on NBC's thoroughbred racing coverage was essentially filled by Rebecca Lowe, beginning with the 2019 Kentucky Derby. Departure from NBC Sports. On February 9, 2017, Costas announced during "Today" that he had begun the process of stepping down from his main on-air roles at NBC Sports, announcing in particular that he would cede his role as primetime host for NBC's Olympics coverage to Mike Tirico (who joined the network from ESPN in 2016), and that he would host Super Bowl LII as his final Super Bowl. However, Costas ultimately dropped out of the coverage entirely. "USA Today" reported that he would similarly step down from "Football Night in America" in favor of Tirico. Costas explained that he was not outright retiring and expected to take on a role at NBC Sports similar to that of Tom Brokaw, being an occasional special correspondent to the division. He explained that his decision "opens up more time to do the things that I feel I'm most connected to; there will still be events, features, and interviews where I can make a significant contribution at NBC, but it will also leave more time for baseball (on MLB Network), and then, at some point down the road, I'll have a chance to do more of the long-form programming I enjoy." Costas told "USA Today" his gradual retirement was planned in advance, and that he did not want to announce it during the 2016 Summer Olympics or the NFL season because it would be too disruptive, and joked: "I'm glad that Sochi wasn't the last one.
You wouldn't want your pink-eye Olympics to be your last Olympics." Costas's final major on-air broadcast for NBC was hosting the 2018 Belmont Stakes, where Justify won the Triple Crown. On January 15, 2019, it was announced that Costas had officially departed from NBC Sports after 40 years. On August 11, 2024, Costas made a rare guest appearance on NBC's coverage of the 2024 Summer Olympics for a segment previewing the 2028 Summer Olympics in Los Angeles, joining Tirico and Al Michaels in a discussion of notable moments from past Olympics hosted by the United States. Talk show hosting. Costas hosted the syndicated radio program "Costas Coast to Coast" from 1986 to 1996, which was revived as "Costas on the Radio". "Costas on the Radio", which ended its three-year run on May 31, 2009, aired on 200 stations nationwide each weekend and was syndicated by the Clear Channel–owned Premiere Radio Networks. Costas also served as the imaging voice of Clear Channel–owned KLOU in St. Louis, Missouri, during that station's period as "My 103.3". Like "Later", Costas's radio shows have focused on a wide variety of topics and have not been limited to sports discussion. "Later with Bob Costas" aired on NBC from 1988 to 1994. Costas decided to leave "Later" after six seasons, having grown tired of the commute to New York City from his home in St. Louis and wishing to lighten his workload in order to spend more time with his family. He also turned down an offer from David Letterman, who had moved to CBS in 1993, to follow him there and become the first host of "The Late Late Show", which was being developed by Letterman's company to air at 12:30 after the "Late Show with David Letterman". In June 2005, Costas was named by CNN president Jonathan Klein as a regular substitute anchor for Larry King's "Larry King Live" for one year. Costas, as well as Klein, made clear that Costas was not trying out for King's position on a permanent basis. Nancy Grace was also named a regular substitute host for the show. On August 18, 2005, Costas refused to host a "Larry King Live" broadcast where the subject was missing teenager Natalee Holloway. Costas said that because there were no new developments in the story, he felt it had no news value, and he was uncomfortable with television's drift in the direction of tabloid-type stories. Beginning in October 2011, Costas was a correspondent for "Rock Center with Brian Williams". He gained acclaim for his November 2011 live interview of former Pennsylvania State University assistant coach Jerry Sandusky concerning charges of sexual abuse of minors, in which Sandusky called in to deny the charges. Costas hosted a monthly talk show, "Costas Tonight", on NBC Sports Network. HBO Sports. In 2001, Costas was hired by HBO to host a 12-week series called "On the Record with Bob Costas". In 2002, Costas began a stint as co-host of HBO's long-running series "Inside the NFL". Costas remained host of "Inside the NFL" through the end of the 2007 NFL season. He hosted the show with Cris Collinsworth and NFL legends Dan Marino and Cris Carter. The program aired each week during the NFL season. Costas left HBO to sign with MLB Network in February 2009. On April 23, 2021, it was announced that Costas would be returning to HBO to host a quarterly interview show called "Back on the Record". MLB Network.
At the channel's launch on January 1, 2009, Costas hosted the premiere episode of "All Time Games", a presentation of the recently discovered kinescope of Game 5 of the 1956 World Series. During the episode, he held a forum with Don Larsen, who pitched MLB's only postseason perfect game in that contest, and Yogi Berra, who caught the game. Costas joined the network full-time on February 3, 2009. He hosted a regular interview show titled "MLB Network Studio 42 with Bob Costas" as well as special programming, and provided play-by-play for select live baseball game telecasts. In 2017, Costas called Game 1 of the American League Division Series between the Boston Red Sox and the Houston Astros on MLB Network. The Astros went on to win 8–2. Costas and his color commentator Jim Kaat received criticism for their "bantering about minutia" and misidentification of plays. Costas also went on to become an internet meme after using the phrase the "sacks were juiced" to describe the bases being loaded. NFL Network. Costas hosted "Thursday Night Football" on NBC and NFL Network in 2016, having returned to broadcasting after a brief absence. He was replaced by Liam McHugh in 2017. CNN and TNT Sports. In July 2020, it was announced that Costas would join CNN as a contributor. According to CNN, Costas would provide commentary "on a wide range of sports-related issues as the industry adapts to new challenges posed by the coronavirus and the frequent intersection of sports with larger societal issues." Costas, who would continue working on MLB Network, said of joining CNN: "CNN's willingness to devote time and attention to sports related topics makes it a good fit for me." On August 20, 2021, Andrew Marchand of the "New York Post" reported that TBS — a sister property via CNN parent WarnerMedia — was nearing an agreement with Costas which would have him hosting the network's National League Championship Series coverage. On October 7, 2021, Turner Sports announced that Costas would be joining TBS for their postseason baseball coverage starting on October 16. As of the 2022 MLB season, Costas provided play-by-play for TBS's Tuesday night baseball package during the regular season. He was the studio host for TBS's ALCS postseason coverage and also provided play-by-play for TBS's ALDS postseason coverage between the Cleveland Guardians and New York Yankees. This marked the first time since the 2000 ALCS on NBC that Costas provided play-by-play for a postseason baseball series in its entirety. Costas provided the play-by-play commentary on TBS for the 2024 American League Division Series between the New York Yankees and Kansas City Royals, receiving criticism for his monotone delivery and perceived lack of interest in the events on the field. Following the series, Costas announced his retirement from calling MLB games. Other appearances. Costas provided significant contributions to the Ken Burns PBS miniseries "Baseball" as well as its follow-up "The 10th Inning". He also appears in another PBS film, "A Time for Champions", produced by St. Louis's Nine Network of Public Media. Notable calls. June 23, 1984: Costas called NBC's "Game of the Week" with Tony Kubek, where Ryne Sandberg hit two separate home runs in the 9th and 10th innings against Bruce Sutter to tie the game. This game is known as "The Sandberg Game". Costas's call of the first home run: Into left center field, and deep. This is a tie ball game! Costas's call of the second home run: Costas: 1–1 pitch.
[Sandberg swings] Kubek: OHHH BOY! Costas: [Over Kubek] And he hits it to deep left center! Look out! Do you believe it, it's gone! We will go to the 11th, tied at 11. October 28, 1995: Costas called Game 6 of the 1995 World Series, where the Atlanta Braves won their first World Series championship since moving to Atlanta in 1966. Left-center field, Grissom on the run. The team of the '90s has its World Championship! October 26, 1997: Costas called Game 7 of the 1997 World Series, where Édgar Rentería hit a walk-off single to give the Florida Marlins their first World Series championship. Costas's call: The 0–1 pitch. A liner, off Nagy's glove, into center field! The Florida Marlins, have won the World Series! June 14, 1998: Costas called Game 6 of the 1998 NBA Finals, Michael Jordan and Phil Jackson's final game with the Chicago Bulls, where Jordan hit a 20-foot jump shot to put the Bulls up 87–86 with 5.2 seconds remaining. The Bulls would win the game by that score, giving them their sixth championship and third consecutive. Costas's call: Jordan with 43. Malone is doubled. They swat at him and steal it! Here comes Chicago. 17 seconds. 17 seconds, from Game 7, or from championship #6. Jordan, open, CHICAGO WITH THE LEAD! Timeout Utah, 5.2 seconds left. Michael Jordan, running on fumes, with 45 points. June 4, 2000: Costas called Game 7 of the 2000 Western Conference Finals for NBC's NBA coverage. Kobe Bryant threw an alley-oop pass to Shaquille O'Neal to give the Lakers a six-point lead with 41.3 seconds remaining. Costas's call of the play: Portland has three timeouts left, the Lakers have two. Bryant... TO SHAQ! September 25, 2014: Costas called Derek Jeter's final game at Yankee Stadium for MLB Network, where Jeter hit an RBI single to win the game. Costas's call: A base hit to right! Here comes Richardson, they're waving him home! The throw, it's close but he scores! On a walk-off hit by Derek Jeter! Interests. Love of baseball. Costas is a devoted baseball fan. He has been suggested as a potential commissioner and wrote "Fair Ball: A Fan's Case for Baseball" in 2000. For his 40th birthday, then-Oakland Athletics manager Tony La Russa allowed Costas to manage the club during a spring training game. The first time Costas visited baseball legend Stan Musial's St. Louis eatery, he left a $3.31 tip on a ten-dollar tab in homage to Musial's lifetime batting average (.331). Costas delivered the eulogy at Mickey Mantle's funeral. In eulogizing Mantle, Costas described the baseball legend as "a fragile hero to whom we had an emotional attachment so strong and lasting that it defied logic". Costas has even carried a 1958 Mickey Mantle baseball card in his wallet. Costas also delivered the eulogy for Musial after his death in early 2013. Costas was outspoken about his disdain for Major League Baseball instituting a playoff wild card, believing it diminished the significance and drama of winning a divisional championship. He preferred a system in which winning the wild card puts a team at some sort of disadvantage, as opposed to an equal level with teams who outplayed them over a 162-game season. Alternatively, as explained in his book "Fair Ball", he proposed having only the three division winners in each league go to the postseason, with the team with the best record receiving a bye to the League Championship Series. Once, on the air on HBO's "Inside the NFL", he mentioned that the NFL regular season counted for something, but baseball's was beginning to lose significance.
With the advent of the second wild card, Costas has said he feels the format has improved, since there is now a greater premium placed on finishing first. He has suggested a further tweak: make the wild card round a best two of three, instead of a single game, with all three games, if necessary, on the home field of the wild card with the better record. He has also disdained the designated hitter rule, saying baseball would be a better game without it. Costas serves as a member of the advisory board of the Baseball Assistance Team, a 501(c)(3) non-profit organization dedicated to helping former Major League, Minor League, and Negro league players through financial and medical difficulties. Political views. Costas considers himself left of center but has said that he has voted for Republican candidates at times as well. On May 26, 2007, Costas discussed the presidency of George W. Bush on his radio show, stating he liked Bush personally and had been optimistic about his presidency, but said the course of the Iraq War and other missteps had led him to conclude Bush's presidency had "tragically failed", considering it "overwhelmingly evident, even if you're a conservative Republican, if you're honest about it, this is a failed administration." The following summer, Costas interviewed Bush during the president's appearance at the 2008 Summer Olympics in Beijing. Controversies. Fastest man in the world. On August 1, 1996, the night of Michael Johnson's 200 m Olympic win, Costas stated on air during NBC's coverage of the 1996 Olympics that Johnson's gold-medal performance in the 200 m (19.32 seconds) was faster than Donovan Bailey's 100 m performance (9.84 seconds) five days earlier, in that 19.32 divided by two is 9.66. Bailey later dismissed Costas's comments as "a person who knew nothing about track talking about it with a lot of people listening"; nonetheless, the sportscaster's remarks touched a nerve. The unofficial "world's fastest man" title typically goes to the Olympic 100 metre champion. The 200 metre time almost always yields a "faster" average speed than a 100-metre race time, since the initial slow speed at the start is spread out over the longer distance (Johnson averaged about 10.35 m/s over 200 metres, against Bailey's 10.16 m/s over 100 metres). In other words, the second 100 metres is run with a "flying start", without the slow acceleration phase of the first 100 metres and without the greater than 0.10 s reaction time of the start. In fact, every 200 metre gold medalist but one from 1968, when fully electronic timing was introduced, to 1996 had a "faster" average speed at the Olympics, yet there had been no controversy over the title of "world's fastest man" until Costas's remarks during the 1996 Olympics. The continuous verbal sparring between the two athletes led to an unsanctioned 150-metre race between Bailey and Johnson in Toronto. Gun culture controversy. During a segment on the "Sunday Night Football" halftime show on December 2, 2012, Costas paraphrased Fox Sports columnist Jason Whitlock in regard to Jovan Belcher's murder-suicide the day prior, saying the United States' gun culture was causing more domestic disputes to result in death, and that it was likely Belcher and his girlfriend would not have died had he not possessed a gun. Critics interpreted his remarks as support for gun control. Many (including former Republican presidential candidates Mike Huckabee and Herman Cain) felt Costas should not have used a program typically viewed as entertainment to publicize political views on sensitive topics.
Lou Dobbs criticized Costas's remarks as supporting the abolition of the Second Amendment by quoting a sportswriter, while Andrew Levy remarked that he had been given a civics lecture by someone who had "gotten rich thanks in part to a sport that destroys men's bodies and brains". However, reporter Erik Wemple of "The Washington Post" praised Costas for speaking out for gun control on the broadcast, commenting that the incident's connection to the NFL gave him an obligation to acknowledge it during the halftime show, stating that "the things that [NFL players] do affect the public beyond whether their teams cover the point spread. And few cases better exemplify that dynamic as powerfully as the Belcher incident." During the following week, Costas defended his remarks in an appearance on MSNBC's program "The Last Word with Lawrence O'Donnell", where he said the remarks were about the country's gun culture, not about gun control as critics had inferred. Costas did suggest that more regulation be placed on America's gun culture: Now, do I believe that we need more comprehensive and more sensible gun control legislation? Yes I do. That doesn't mean repeal the Second Amendment. That doesn't mean a prohibition on someone having a gun to protect their home and their family. It means sensible and more comprehensive gun control legislation. But even if you had that, you would still have the problem of what Jason Whitlock wrote about, and what I agree with. And that is a gun culture in this country. 2014 Winter Olympics. During his coverage of the 2014 Winter Olympics, Costas was criticized by some conservative members of the media, including Michelle Malkin and Glenn Beck, for allegedly praising Russian president Vladimir Putin's role in defusing tensions surrounding Syria and Iran. Other conservative media commentators, including Bill O'Reilly and Bernard Goldberg, defended Costas's remarks as factually correct and pointed out that Costas had also voiced considerable criticism of both Russia and Putin while broadcasting from Sochi. During an interview on Fox News, Goldberg said "...the idea that Costas somehow portrayed Vladimir Putin as a benign figure is ridiculous." Costas defended himself on O'Reilly's broadcast on March 3, reiterating that he had criticized Putin immediately before and after the statements in question. O'Reilly then aired a portion of an Olympic commentary in which Costas was pointedly critical of the Russian leader. Costas also indicated that Senator John McCain, who had been among those who initially criticized him, had called to apologize after hearing the full segment in context. Football's future. While visiting the University of Maryland in November 2017 for a roundtable discussion on various sports topics, Costas said the sport of football was in decline, with evidence mounting that the repetition of concussions "destroys people's brains", and that he would not allow a son with athletic talent to play it. Costas had been scheduled to work Super Bowl LII, his eighth as a host; although he had stepped down from "Football Night in America" in favor of his successor Mike Tirico, Costas was to return while Tirico prepared to lead NBC's coverage of the 2018 Winter Olympics, set to begin a few days later. However, the network announced shortly before the game that Liam McHugh would instead join Dan Patrick as a co-host, leading to speculation that NBC had removed Costas from the NFL's biggest game over his comments.
Costas initially denied this, saying it made more sense for McHugh, who had been hosting Thursday night games on NBC, to serve in that capacity. However, he later admitted in an interview with ESPN's "Outside the Lines" that the comments were indeed the basis of his removal, which ultimately resulted in his departure from the network after forty years. Personal life. Costas was married from 1983 to 2001 to Carole "Randy" Randall Krummenacher. They had two children, son Keith (born 1986) and daughter Taylor (born 1989). Costas once jokingly promised Minnesota Twins center fielder Kirby Puckett that, if Puckett was batting over .350 by the time Costas's child was born, he would name the baby Kirby. Puckett was indeed hitting better than .350, but Costas's son initially was not given a first (or second) name of Kirby. After Puckett reminded Costas of the agreement, the birth certificate was changed to "Keith Michael Kirby Costas". On March 12, 2004, Costas married his second wife, Jill Sutton. Costas and his wife now reside primarily in Newport Beach, California. Although Costas was born and raised in the New York area, he has often said he thinks of St. Louis as his hometown. Costas's children have also won Sports Emmys: Keith has won two as an associate producer on MLB Network's "MLB Tonight", and Taylor has won as an associate producer on NBC's coverage of the 2012 Summer Olympics. In popular culture. Films. In 1994, Costas appeared as the play-by-play announcer for the World Series (working alongside Tim McCarver) in the movie "The Scout". In 1998, he appeared as himself, along with his ABC rival and counterpart Al Michaels, in the movie "BASEketball". Costas voiced an animated car version of himself named Bob Cutlass in the movies "Cars" (2006) and "Cars 3" (2017). He also appeared as himself in the 2001 movie "Pootie Tang", in which he remarks that he saw "the longest damn clip ever". Costas's voice appeared in the 2011 documentary film "Legendary: When Baseball Came to the Bluegrass", which detailed the humble beginnings of the Lexington Legends, a minor league baseball team located in Lexington, Kentucky. In 2021, Costas played himself in "Here Today", directed by Billy Crystal. Popular culture. Costas has been alluded to several times in popular music. The songs "Mafioso" by Mac Dre, "We Major" by Domo Genesis and "The Last Huzzah" by Mr. Muthafuckin' eXquire all refer to Costas. He was also mentioned in a Ludacris song after Costas had mentioned the rapper on the late-night talk show "Last Call with Carson Daly". In June 2013, Costas provided the voice of God in the Monty Python musical "Spamalot" at The Muny Repertory in St. Louis. Television guest roles. Apart from his normal sportscasting duties, Costas has also presented periodic sports blooper reels, and announced dogsled and elevator races, on "Late Night with David Letterman". In 1985, Costas appeared on "The War to Settle the Score", a pre-"WrestleMania" program that the World Wrestling Federation aired on MTV. In 1993, he hosted the "pregame" show for the final episode of "Cheers". Costas once appeared on the television program "NewsRadio" as himself, hosting an award show and later having some humorous encounters with the crew of WNYX. He also had a recurring guest role as himself on the HBO series "Arli$$". Costas has been impersonated several times by Darrell Hammond on "Saturday Night Live".
Costas was "supposed" to appear in the fourth-season premiere of "Celebrity Deathmatch" (ironically titled "Where is Bob Costas?") as a guest-commentator, but about halfway through the episode it was revealed that John Tesh had killed him before the show to take his place. This was likely in response for Tesh not being invited back to NBC for its gymnastics coverage at the 2000 Olympics. In 1999, Costas appeared as a guest on "Space Ghost Coast to Coast" during its sixth season. On June 13, 2008, Costas appeared on MSNBC's commercial-free special coverage of "Remembering Tim Russert (1950–2008)". On January 30, 2009, Costas guest-starred as himself on the television series "Monk" in an episode titled "Mr. Monk Makes the Playoffs"'. He mentions to Captain Stottlemeyer about how Adrian Monk once helped him out of a problem several years ago with regards to a demented cat salesman. He apparently sold Costas a cat that allegedly tried to kill him with a squeeze toy. (In fact when he signs off he says, "The cat was definitely trying to kill me.") Costas guest-voiced as himself in 2010 "Simpsons" episode, "Boy Meets Curl", when Homer and Marge make the U.S. Olympic curling team. Costas also guest-voiced as himself on the "Family Guy" episode "Turban Cowboy" in an interview with Peter after he wins the Boston Marathon by hitting everyone with his car. On February 11, 2010, Stephen Colbert jokingly expressed his desire to stab Costas with an ice pick at the upcoming 2010 Winter Olympics in Vancouver so Colbert could take over as host. Costas later made a cameo appearance on the February 25, 2010, edition of Colbert's show. In January 2013, Costas appeared as himself in the "Go On" episode "Win at All Costas" with Matthew Perry, wherein Ryan King auditions with him for a TV show. Real footage of Costas from NBC's pregame show before Game5 of the 1994 NBA Finals was used in the second episode of "". Costas appeared on the September 22, 2017, episode of "Real Time with Bill Maher" to discuss issues such as concussions and the role of political activism in professional sports (namely by Colin Kaepernick). Video games. In 2002, Costas was the play-by-play announcer, alongside Harold Reynolds, for "Triple Play 2002" during the ballgame for PlayStation 2 and Xbox. External links.
4896
10689882
https://en.wikipedia.org/wiki?curid=4896
Bamberg
Bamberg (, , ; East Franconian: "Bambärch") is a town in Upper Franconia, Bavaria, Germany, on the river Regnitz close to its confluence with the river Main. Bamberg had 79,000 inhabitants in 2022. The town dates back to the 9th century, when its name was derived from the nearby "" castle. Cited as one of Germany's most beautiful towns, with medieval streets and buildings, the old town of Bamberg, with around 2,400 timber houses, has been a UNESCO World Heritage Site since 1993. From the 10th century onwards, Bamberg became a key link with the Western Slavic peoples, notably those of Poland and Pomerania. It experienced a period of great prosperity from the 12th century onwards, during which time it was briefly the centre of the Holy Roman Empire. Emperor Henry II was buried in the old town, alongside his wife Kunigunde. The town's architecture from this period strongly influenced that of Northern Germany and Hungary. From the middle of the 13th century onwards, the bishops were princes of the Empire and ruled Bamberg, overseeing the construction of monumental buildings. This growth was complemented by the see's acquisition of large portions of the estates of the Counts of Meran in 1248 and 1260, partly through purchase and partly through the appropriation of extinguished fiefs. Bamberg lost its independence in 1802, following the secularization of church lands, and became part of Bavaria in 1803. The town was first connected to the German rail system in 1844, which has been an important part of its infrastructure ever since. After a communist uprising took control of Bavaria in the years following World War I, the state government fled to Bamberg and stayed there for almost two years before the Bavarian capital of Munich was retaken by "Freikorps" units (see Bavarian Soviet Republic). The first republican constitution of Bavaria was passed in Bamberg, becoming known as the "Bamberger Verfassung" (Bamberg Constitution). Following the Second World War, Bamberg was an important base for the Bavarian, German, and then American military stationed at Warner Barracks, until its closure in 2014. History. During the post-Roman centuries of Germanic migration and settlement, the region later included in the Diocese of Bamberg was inhabited for the most part by Slavs. The town, first mentioned in 902, grew up by the castle "", which gave its name to the Babenberg family. On their extinction, it passed to the Saxon house. The area was Christianized chiefly by the monks of the Benedictine Fulda Abbey, and the land was under the spiritual authority of the Diocese of Würzburg. In 1007, Holy Roman Emperor Henry II made Bamberg a family inheritance and the seat of a separate diocese. The Emperor's purpose in this was to make the Diocese of Würzburg less unwieldy in size and to give Christianity a firmer footing in the districts of Franconia east of Bamberg. In 1008, after long negotiations with the Bishops of Würzburg and Eichstätt, who were to cede portions of their dioceses, the boundaries of the new diocese were defined, and Pope John XVIII granted the papal confirmation in the same year. Henry II ordered the building of a new cathedral, which was consecrated on 6 May 1012. The church was enriched with gifts from the pope, and Henry had it dedicated in the pope's honor. In 1017, Henry founded Michaelsberg Abbey on the Michaelsberg ("Mount St Michael") near Bamberg, a Benedictine abbey for the training of the clergy.
The emperor and his wife, Kunigunde, gave large temporal possessions to the new diocese, and it received many privileges out of which grew the secular power of the bishop. Pope Benedict VIII visited Bamberg in 1020 to meet Henry II for discussions concerning the Holy Roman Empire. While he was there, he placed the diocese in direct dependence on the Holy See, and he also personally consecrated some of Bamberg's churches. For a short time, Bamberg was the centre of the Holy Roman Empire. Henry and Kunigunde were both buried in the cathedral. From the middle of the 13th century onwards, the bishops were princes of the Empire and ruled Bamberg, overseeing the construction of monumental buildings. In 1248 and 1260, the see obtained large portions of the estates of the Counts of Meran, partly through purchase and partly through the appropriation of extinguished fiefs. The old Bishopric of Bamberg was composed of an unbroken territory extending from Schlüsselfeld in a northeasterly direction to the Franconian Forest, and possessed in addition estates in the Duchies of Carinthia and Salzburg, in the Nordgau (the present Upper Palatinate), in Thuringia, and on the Danube. Through the changes resulting from the Reformation, the territory of this see was reduced by nearly one half in extent. The coat of arms of the city of Bamberg has been known since 1279 in the form of a seal. The witch trials of the 17th century claimed about one thousand victims in Bamberg, reaching a climax between 1626 and 1631, under the rule of Prince-Bishop Johann Georg II Fuchs von Dornheim. The famous "Drudenhaus" (witch prison), built in 1627, is no longer standing today; however, detailed accounts of some cases, such as that of Johannes Junius, remain. In 1647, the University of Bamberg was founded. Bambrzy are German Poles who are descended from settlers from the Bamberg area who settled in villages around Poznań in the years 1719–1753. In 1759, the possessions and jurisdictions of the diocese situated in Austria were sold to that state. When the secularization of church lands took place in 1802, the diocese had a population of 207,000. Bamberg thus lost its independence in 1802, becoming part of Bavaria in 1803. The Free State of Bavaria and the Federal Republic of Germany later granted protections to Bamberg, though the city handles the management of its own properties. Bamberg was first connected to the German rail system in 1844, which has been an important part of its infrastructure ever since. After a communist uprising took control of Bavaria in the years following World War I, the state government fled to Bamberg and stayed there for almost two years before the Bavarian capital of Munich was retaken by "Freikorps" units (see Bavarian Soviet Republic). The first republican constitution of Bavaria was passed in Bamberg, becoming known as the "Bamberger Verfassung" (Bamberg Constitution). In February 1926, Bamberg served as the venue for the Bamberg Conference, convened by Adolf Hitler in his attempt to foster unity and to stifle dissent within the then-young Nazi Party. Bamberg was chosen for its location in Upper Franconia, reasonably close to the residences of the members of the dissident northern Nazi faction but still within Bavaria. During the Bombing of Bamberg, the city was hit a total of nine times by Allied warplanes between 1944 and 1945. While Bamberg was not attacked as badly as nearby Nuremberg, 4.4% of the city was destroyed and 378 civilians died.
The biggest and deadliest bombing run happened on 22 February 1945. In the afternoon, American planes attacked the Bamberg railway station and its surroundings with bombs. Because of poor visibility, bombs were also dropped over residential houses, killing a total of 216 civilians and damaging or destroying many houses between Oberer Stephansberg and Oberer Kaulberg. The inner city was also hit, particularly in the Obstmarkt, Lange Straße, Grüner Markt and Keßlerstraße. Three significant landmarks in the city were hit: the Erlöserkirche or "Church of the Redeemer" at the Kunigundendamm, which was almost completely destroyed (only the tower remained); the historic Altane on the Grüner Markt; and the Alte Maut or Old Toll. A follow-up attack was planned for 23 February but was ultimately cancelled due to bad weather. After that, low-flying Allied aircraft continued to attack Bamberg, threatening large gatherings of people and sometimes also dropping leaflets mocking National Socialism and its propaganda. Another 67 people died as a result of these attacks. The city fell with little resistance to American troops on 14 April, even though the retreating German forces had blown up all of the bridges into the city. After the war had ended, reconstruction efforts began. Geography. Bamberg is located in Franconia, north of Nuremberg by railway and east of Würzburg, also by rail. It is situated on the Regnitz river, shortly before it flows into the Main river. Its geography is shaped by the Regnitz and by the foothills of the Steigerwald, part of the German uplands. From northeast to southwest, the town is divided into first the Regnitz plain, then one large and several small islands formed by two arms of the Regnitz ('), and finally the part of town on the hills, the "Hill Town" ('). The seven hills of Bamberg. Bamberg extends over seven hills, each crowned by a church. This has led to Bamberg being called the "Franconian Rome", although a running joke among Bamberg's tour guides is to refer to Rome instead as the "Italian Bamberg". The hills are Cathedral Hill, Michaelsberg, Kaulberg/Obere Pfarre, Stefansberg, Jakobsberg, Altenburger Hill and Abtsberg. Climate. The climate in this area has mild differences between highs and lows, and there is adequate rainfall year-round. The Köppen climate classification subtype for this climate is "" (Marine West Coast Climate/Oceanic climate), with a certain continental influence, as indicated by average winter nighttime temperatures well below zero. Economy. Bamberg is considered an important economic center and is one of the 15 economically strongest of Bavaria's 96 districts and independent cities. Central locations for industry, manufacturing and trade can be found in the north and east of the city. In the north, around the Bamberg harbor, there is a large industrial area that partly extends into the Hallstadt urban area. Bosch operates its largest plant in Germany in the east and is also the city's most important employer. Relevant economic sectors in Bamberg are the automotive supply industry, electrical engineering and the food industry. The traditional industry of market gardening, with large inner-city cultivation areas, has characterized the city since its beginnings and is still present. Due to its UNESCO World Heritage status and the more than 800,000 overnight guests a year in the city alone, tourism, hotels and gastronomy also play a central role in the city's economy. Key economic figures.
In the 2010s, Bamberg's gross domestic product increased by a total of 17.7%. The city of Bamberg generated €5.1 billion in 2021 (€66,543 per capita). In the 2016 Future Atlas, Bamberg was ranked 32nd out of 402 districts, municipal associations and independent cities in Germany, making it one of the places with "very high future prospects". The number of business registrations in the city of Bamberg remains at a high level: there were 682 registrations in 2023, compared to 655 the year before, and 702 new companies were registered with the city in 2021. As of 2023, there were 2,624 companies in the city of Bamberg, 87 of which employed more than 100 people. Of these companies, almost half (969) are craft businesses. Labour market and employment figures. As the population of the city has grown, the number of employees has also risen continuously in recent years: between 2005 and 2022, the number of employees increased by almost 20%, to around 56,500 people in employment in the city in 2022. The unemployment rate in December 2022 was 4.3%, above the Bavarian average of 3.1%; in the neighboring district of Bamberg, the unemployment rate was 2.3%. Industry and manufacturing. Important sectors of the manufacturing industry in Bamberg are drive technologies and electrical engineering. Bosch has been operating a production facility in Bamberg, with over 6,000 employees, since 1939. The company produces energy, mobility and drive systems, in particular spark plugs for the automotive industry. Because of the plant's dependence on the combustion engine, the company announced in 2019 that it would be switching to fuel cells, although around 1,000 jobs are to be transferred to other locations from 2026. Coburg-based automotive supplier Brose has had an administration building with 600 employees on the Berliner Ring since 2016. Together with the plant in Hallstadt, the region is the company's second-largest location, with more than 2,000 employees. Another important company in the field of electrical engineering is Wieland Electric. Founded in Bamberg in 1910, it is considered a pioneer in electrical connection technology and is still the world market leader for pluggable installation technology in the building sector. Rudolf Zimmermann Bamberg (RZB), with over 800 employees worldwide, produces lights and lighting systems near the port of Bamberg. Over 70 companies are located on the port's 100 hectares. In 2023, over 416,000 tons of goods, especially foodstuffs and bulk goods, were loaded at the port, and over 700 river cruise ships docked there. The Bamberg asphalt mixing plant and a Schwenk concrete plant are also located in the immediate vicinity. Bamberg is also home to numerous small and medium-sized companies in other sectors. A special feature is the centuries-old tradition of instrument making. Organ building, currently continued by master craftsman Thomas Eichfelder, is particularly noteworthy, as is the construction of violins, as well as clarinets and other woodwind instruments. Tourism and retail. Bamberg's city centre is characterized by a diverse retail sector, partly due to tourism; this sector generated a turnover of 682 million euros in 2023. In the 2018 shopping mile ranking by the real estate company Jones Lang LaSalle (JLL), Bamberg was ranked first in Germany among cities with fewer than 100,000 inhabitants.
With an average of 5,155 visitors per hour, the Grüner Markt was the 52nd most popular shopping street in Germany. In 2023, the Grüner Markt had almost 8.5 million passers-by, an increase of 130,000 visitors compared to the previous year. Bamberg recorded 807,294 overnight stays in 2023, an increase of 11% compared to 2022. 85% of overnight guests came from Germany and 15% from abroad; the number of travelers from abroad rose particularly sharply in 2023, by 22%. The most important tourist countries of origin are the US, the Netherlands, Poland and Austria. Tourism generates around 330 million euros in gross revenue in Bamberg every year in the hospitality, retail and service sectors, resulting in an income of 153 million euros. Attractions. The Town of Bamberg was inscribed on the UNESCO World Heritage List in 1993 on account of its medieval layout and its well-preserved historic buildings. Urban gardening has been practiced in Bamberg since the Middle Ages, and the Market Gardeners' District, together with the City on the Hills and the Island District, is an integral part of the World Heritage site. In 2005, the municipality established a unit to coordinate the implementation of the World Heritage Convention in Bamberg, and in 2019 a visitor and interpretation centre opened for the World Heritage site. Some of the main sights are: Bamberg Cathedral is a late Romanesque building with four towers. It was founded in 1004 by Emperor Henry II, finished in 1012 and consecrated on 6 May of that year. It was later partially destroyed by fire in 1081. The new cathedral, built by Saint Otto of Bamberg, was consecrated in 1111 and received its present late-Romanesque form in the 13th century. The cathedral is long, wide, high, and the four towers are each about high. It contains many historic works of art, such as the marble tomb of the founder and his wife, considered one of the greatest works of the sculptor Tilman Riemenschneider and carved between 1499 and 1513. Another treasure of the cathedral is an equestrian statue known as the Bamberg Horseman (""). This statue, possibly depicting the emperor Conrad III, most likely dates to the second quarter of the 13th century, and it also serves as a symbol of the town of Bamberg. The ' (New Residence) (1698–1704) was initially occupied by the prince-bishops, and from 1864 to 1867 by the deposed King Otto of Greece. Its ' (Rose Garden), with over 4,500 roses, overlooks the town. The Altenburg is located on the highest of Bamberg's seven hills. It was mentioned for the first time in 1109, and between 1251 and 1553 it was the residence of Bamberg's bishops. Destroyed in 1553 by Albert Alcibiades, Margrave of Brandenburg-Kulmbach, it was afterwards used, after scant repairs, only as a prison, and it increasingly fell into decay. In 1801, A. F. Marcus bought the castle and completely repaired it. His friend, the famous German writer E.T.A. Hoffmann, who was very impressed by the building, lived there for a while. The next owner, Anton von Greifenstein, founded an association in 1818 to preserve the castle; this society still maintains the entire property today. The Altenburg now houses a restaurant. Other churches are the ', an 11th-century Romanesque basilica; the '; and the ' or ' (1320–1387), which has now been restored to its original pure Gothic style. The ', a 12th-century Romanesque church (restored) on the Michaelsberg, was formerly the church of the Benedictine Michaelsberg Abbey, secularized in 1803, and now contains the ', or almshouse, as well as the museum and municipal art collections.
Of the bridges connecting the sections of the lower town, the ' was completed in 1455. Halfway across it, on an island, is the ' or town hall (rebuilt 1744–1756). The lyceum, formerly a Jesuit college, contains a natural history museum. The old palace ("") was built in 1591 on the site of an old residence of the counts of Babenberg. Monuments include the Maximilian fountain (1880), with statues of King Maximilian I of Bavaria, the emperor Henry II and his wife, Conrad III, and Saint Otto, bishop of Bamberg. There are also tunnels beneath the town. These were originally constructed as mines that supplied sandstone for use in construction or as an abrasive cleaner. Mining came to an end in 1920, but a tunnel network remained, and it was used as an air raid shelter during World War II. Part of the network can be visited on a guided tour. Beer. Bamberg is known for its smoked Rauchbier and is home to 11 breweries, including Brauerei Heller-Trum (Schlenkerla), Brauerei Kaiserdom, Klosterbräu, Gasthausbrauerei Ambräusianum, Kron Prinz, and Weyermann Röstmalzbierbrauerei. Weyermann Specialty Malting, founded in Bamberg in 1879, supplies breweries around the world. Every August there is a five-day "", a kirmess celebrated with beers. The Franconia region surrounding Bamberg is home to more than 200 breweries. In October and early November, many of the 70 breweries in and around Bamberg celebrate Bockbieranstiche, with special releases of Bock beer. Education. The University of Bamberg, named Otto-Friedrich University, offers higher education in the areas of social science, business studies and the humanities, and is attended by more than 12,000 students. The University of Applied Sciences Bamberg offers higher education in the area of public health. Bamberg is also home to eight secondary schools (gymnasiums), as well as numerous other institutes for primary, secondary, technical, vocational and adult education. Infrastructure. Transport. Railway. The InterCityExpress main line No. 28 (Munich – Nuremberg – Leipzig – Berlin / – Hamburg) and the main line No. 18 (Munich – Nuremberg – Halle – Berlin / – Hamburg) run on the Nuremberg–Bamberg and the Bamberg–Hof lines through Bamberg station. By train, Munich can be reached in less than two hours and, since the completion in 2017 of the Nuremberg–Erfurt high-speed railway through the Thuringian mountains, Berlin in less than three hours. Two intercity trains, of line No. 17 (Vienna – Warnemünde) and line No. 61 (Leipzig – Nuremberg – Karlsruhe), also run through Bamberg. East–west connections are poorer. Bamberg is connected to other towns in eastern Upper Franconia, such as Bayreuth, Coburg, and Kronach, via the Bamberg–Hof line, with trains usually running at least every hour. Connections to the west, on the Würzburg–Bamberg line, consist of hourly regional trains to Würzburg, which is fully connected to the ICE network. Tourists arriving at Frankfurt International Airport can take advantage of the new direct connection from Frankfurt's main station. Motorways. Bamberg is not near any of the major (i.e. single-digit) autobahns, but it is nevertheless well connected to the network in all directions: the A70 from Schweinfurt (connecting to the A7 there) to Bayreuth (connecting to the A9) runs along the northern edge of the town, while the A73 on the eastern side of town connects Bamberg to Nuremberg (connecting to the A9) and Thuringia, ending at Suhl. Air transport. Bamberg is served by a local airfield, where mostly public aircraft operate.
It used to be a military airport (IATA code: ZCD, ICAO code: EDQA). It is also possible to charter public flights to and from this airport. Most international tourists who travel by plane arrive at Frankfurt International Airport or Munich Airport. The nearest major airport is Nuremberg Airport, which can be reached within 45 minutes by car or one hour by train and subway. Water transport. Both the Rhine-Main-Danube Canal and its predecessor, the Ludwig Canal, begin near Bamberg. The Ludwig Canal was opened in 1846 but closed in 1950 after damage sustained during the Second World War. With the completion of the Rhine-Main-Danube Canal in 1992, uninterrupted water transport was again made possible between the North Sea and the Black Sea. Local public transport. Local public transport within Bamberg relies exclusively on buses. More than 20 routes connect the outlying quarters and some villages in the vicinity to the central bus station. In addition, there are several "Night Lines" (the last of these, though, tend to run around midnight) and some park-and-ride lines from parking lots on the periphery to the town centre. A short-lived tram system existed in the 1920s. Military bases. Bamberg was an important base for the Bavarian, German, and then American military stationed at Warner Barracks. Warner Barracks was closed in the fall of 2014, the last battalion to leave being the 54th Engineer Battalion, and the grounds were returned to the German government. In 2016, a large part of the facility was taken over by the German Federal Police for training purposes. Muna Kaserne was a small base occupied by the 504th Maintenance Company, 71st Maintenance Battalion; it was part of Warner Barracks although located separately. Governance. Bamberg is an urban district, or "kreisfreie Stadt". Its town council ("Stadtrat") and its mayor ("Oberbürgermeister") are elected every six years, though not in the same year; thus, the last municipal election for the town council was in 2014 and for the mayor in 2012. As an exception to the six-year term, the term starting in 2012 will last eight years, in order to synchronize the elections with those in the rest of Bavaria. As of the elections of 16 March 2014, the 44-member town council comprises 12 CSU councillors, 10 SPD councillors, 8 Green councillors, 4 councillors of the "Bamberger Bürger-Block" and 4 of the "Freie Wähler" (Free Voters), both local political movements. These five parties achieved the number of councillors necessary to form a parliamentary group. In addition, there are 3 councillors of the "Bamberger Unabhängige Bürger" and 1 councillor each of the "Bamberger Realisten", the FDP and the "Bamberger Linke Liste". The previous council, elected on 2 March 2008, was composed of 15 CSU councillors, 10 SPD councillors, 7 Green councillors, 5 councillors of the Bamberger Bürger-Block and 3 of the Freie Wähler; these five parties achieved the number of councillors necessary to form a parliamentary group. In addition, there were 2 councillors of the "Bamberger Realisten" and one each of the FDP and the Republikaner, making them ineligible for caucus status. Twin towns – sister cities. Bamberg is twinned with several towns and cities.
4904
7903804
https://en.wikipedia.org/wiki?curid=4904
Bill Mumy
Charles William Mumy Jr. (; born February 1, 1954) is an American actor, writer, producer, and musician. He came to prominence in the 1960s as a child actor whose work included television appearances on "Bewitched", "I Dream of Jeannie", "The Twilight Zone", and "Alfred Hitchcock Presents", a role in the film "Dear Brigitte", and a three-season role as Will Robinson in the 1960s sci-fi series "Lost in Space". Mumy later appeared as lonely teenager Sterling North in the film "Rascal" (1969) and as Teft in the film "Bless the Beasts and Children" (1971). In the 1990s, Mumy performed the role of Lennier in all five seasons of the sci-fi TV series "Babylon 5" and narrated the Emmy Award–winning series "Biography". Mumy is also a guitarist, singer, songwriter, and composer, and received an Emmy nomination for original music in "Adventures in Wonderland" (1992). As a musician, Mumy performs as a solo artist and an occasional guest performer, and formerly performed as half of the duo Barnes & Barnes before his bandmate Robert Haimer died in 2023. From 1988 through the 1990s he performed at San Diego Comic-Con and other comics-related events as part of the band Seduction of the Innocent (named after Fredric Wertham's book of the same name). The band released one CD, "The Golden Age". Early life and career. Mumy was born in San Gabriel, California, to Charles William Mumy, a cattle rancher, and Muriel Gertrude Mumy. He began his professional career at age seven and has worked on more than four hundred television episodes, eighteen films, various commercials, and scores of voice-over projects. He has also worked as a musician, songwriter, recording artist, and writer. Television and film career. Among Mumy's earliest television roles was six-year-old Willy in the "Donald's Friend" (1960) episode of the NBC-TV family drama series "National Velvet", starring Lori Martin. He starred in three episodes of CBS-TV's original "Twilight Zone": "It's a Good Life" (season 3, episode 8; November 1961), as six-year-old Anthony, who terrorizes his town with psychic powers (a role he later reprised along with his daughter Liliana in the "It's Still a Good Life" episode of the second revival series); "In Praise of Pip" (September 1963), as a vision of Jack Klugman's long-neglected dying son; and "Long Distance Call" (March 1961), as Billy Bayles, who talks to his dead grandmother through a toy telephone. In 1961, Mumy was cast on CBS-TV's "Alfred Hitchcock Presents" in "The Door Without a Key", featuring John Larch, who also played his father in "It's a Good Life". The same year, Mumy starred as little Jackie in the episode "Bang! You're Dead", featuring Marta Kristen, who later played his sister Judy on "Lost in Space". Mumy was cast as Mark Murdock in the "Keep an Eye on Santa Claus" (1962) episode of the ABC-TV drama series "Going My Way", starring Gene Kelly. His fellow guest stars were Cloris Leachman (who played his mother in "It's a Good Life"), Steve Brodie, and Frank McHugh. At age eight, Mumy appeared in Jack Palance's ABC-TV circus drama "The Greatest Show on Earth" (1963); he was cast as Miles, a parentless boy, in the "Perry Mason" episode "The Case of the Shifty Shoebox" (1963), and he portrayed Freddy in the "End of an Image" (1963) episode of NBC-TV's modern Western series "Empire", starring Richard Egan.
In 1964, he was cast as Richard Kimble's nephew in the ABC-TV "The Fugitive" episode "Home Is the Hunted"; as Barry in the NBC-TV medical drama "The Eleventh Hour" episode "Sunday Father"; as himself three times in the ABC sitcom "The Adventures of Ozzie and Harriet"; in the Disney film "For the Love of Willadena"; and as a troubled orphan taken in by the Stephenses in "A Vision of Sugarplums" (December 1964), an episode of the ABC-TV fantasy sitcom "Bewitched". Mumy was reportedly the first choice to portray Eddie Munster in the 1964 CBS situation comedy "The Munsters", but his parents objected to the extensive makeup requirements. The role instead went to Butch Patrick, and Mumy appeared in one episode as a friend of Eddie's. Mumy guest-starred in the NBC-TV "I Dream of Jeannie" episode "Whatever Became of Baby Custer?" (1965). That same year, he also appeared in the "Bewitched" episode "Junior Executive", in which he played a young Darrin Stephens. Mumy starred in "Dear Brigitte" (1965), a film adaptation of the novel "Erasmus with Freckles", as Erasmus Leaf, a child mathematical genius who develops a crush on Brigitte Bardot (played by herself in the film). His parents, played by James Stewart and Glynis Johns, attempt to manage his obsession. "Lost in Space" and beyond. From 1965 to 1968, Mumy portrayed Will Robinson in "Lost in Space", the recipient of numerous warnings (including "Danger, Will Robinson") from the show's robot character, voiced by Dick Tufeld. Mumy was later cast in "Bless the Beasts and Children" (1971) as Teft, a leader in a group of misfit teenage boys resolved to save a herd of bison from hunters. He also played a musician friend of Cliff DeYoung's character in the television film "Sunshine" (1973), later reprising the role in "Sunshine Christmas" and in the TV series "Sunshine". In 1974, Mumy played Nick Butler in the pilot episode of NBC's "The Rockford Files" and appeared in a later first-season episode as a sidewalk artist. In 1988, he played Ben Matlock's genius nephew, Dr. Irwin Bruckner, on "Matlock". In 1996, Mumy was a writer and co-creator of "Space Cases", a Nickelodeon television show with themes similar to those of "Lost in Space". Between 1994 and 1998, he played the ambassadorial aide Lennier in the syndicated science fiction series "Babylon 5". In November 1998, he played Kellin, a Starfleet officer, in the "Star Trek: Deep Space Nine" episode "The Siege of AR-558", in which he assists in defeating a Jem'Hadar detachment. To Mumy's delight, his character this time was human, both because of the makeup time involved and because of his distaste for being known as an "alien actor"; while playing Lennier in "Babylon 5", he had been required to wear prosthetic makeup. Mumy later appeared in a 2006 episode of "Crossing Jordan" and in the Syfy original film "A.I. Assault". In 2018, Mumy appeared in the pilot episode of the Netflix remake series "Lost in Space". His character's name, Dr. Z. Smith, is an homage to the character played by Jonathan Harris in the 1965 television series. Voice acting career. Mumy has narrated over 50 episodes of the Arts & Entertainment Network's "Biography" series, and has hosted and narrated several other documentaries and specials for A&E, Animal Planet, The Sci-Fi Channel, and E!. He has also worked on animated shows such as "Ren and Stimpy", "Scooby-Doo", "", Steven Spielberg's "Animaniacs", "Bravest Warriors", "The Oz Kids", and Disney's "Buzz Lightyear of Star Command" and "Doc McStuffins".
Mumy's work also includes voice-overs in national commercials for Bud Ice, Farmers Insurance, Ford, Blockbuster, Twix, Oscar Mayer and McDonald's. Music. Mumy plays the banjo, bass, guitar, harmonica, keyboards, mandolin, and percussion. Among his various musical credits, he has written and recorded songs with America, performed on tour with Shaun Cassidy, and played with Rick Springfield's band in the film "Hard to Hold". He created the band The Be Five with other "Babylon 5" actors. He and Marta Kristen performed "Sloop John B" in the "Lost in Space" episode "Castles in Space" (season 3, episode 14). Mumy has released a number of solo CDs, including "Dying to Be Heard", "In the Current", "Pandora's Box", "After Dreams Come True", "Los Angeles Times" and "Ghosts", as well as nine albums with his music partner Robert Haimer as Barnes and Barnes. Their most famous hit is the song "Fish Heads", which "Rolling Stone" named one of the top 100 videos of all time. He also performs with the Jenerators, a blues-rock band based in Los Angeles featuring Tom Hebenstreit on vocals, electric guitars, and keyboards; Mumy on vocals, acoustic and electric guitars, harmonica, keyboards and percussion; Gary Stockdale on vocals and bass; Miguel Ferrer on vocals, percussion and drums; David Jolliffe on guitar, percussion and vocals; and Chris Ross on drums and percussion. Additionally, Mumy released a Byrds tribute song, "When Roger Was Jim" (2012). In 2017, along with John Cowsill (The Cowsills) and Vicki Peterson (The Bangles), he founded the Action Skulls. Their first CD, "Angels Hear", which also included posthumous contributions from the bassist Rick Rosas, was released on September 27, 2017. Mumy produces and hosts "The Real Good Radio Hour", a weekly series on KSAV Internet Radio focusing on various styles of music and the artists who pioneered them. "Lost in Space" activities in later years. In a 2010 interview on "Blog Talk Radio"'s "Lessons Learned", host Rick Tocquigny was asked whether Mumy had been a Jonathan Harris fan before they appeared together on "Lost in Space". Tocquigny stated that when Mumy was five years old, he was too young to watch his mentor's show "The Third Man", which would have aired late at night, but he was old enough to see "The Bill Dana Show". On June 14, 2006, Mumy worked with Harris one last time, posthumously. Years before Harris died, he had recorded voice work for the animated short "The Bolt Who Screwed Christmas", narrating the film and playing the part of The Bolt. As a tribute to Harris, writer-director John Wardlaw added a scene that reunited "Lost in Space" cast members Mumy, Marta Kristen, and Angela Cartwright as the animated Ratchett family. Mumy appears in two episodes of the 2018 Netflix series "Lost in Space", playing the role of Dr. Smith, whose identity is stolen by June Harris, the villain. Mumy attends "Lost in Space" reunions and shows, and with Angela Cartwright he co-authored a 2015 book, "Lost (and Found) in Space". He and Cartwright also co-authored the 2021 book "Lost (and Found) in Space 2: Blast Off into the Expanded Edition". Other work. Mumy and co-author Peter David published a short story, "The Black '59" (1992), in the anthology "Shock Rock", edited by F. Paul Wilson. He has also written a number of comics. With his friend, the late Miguel Ferrer, Mumy created Comet Man and Trypto the Acid Dog. They also co-wrote the Marvel Graphic Novel "The Dreamwalker". Personal life. Mumy married Eileen Joy Davis on October 9, 1986.
They live in Laurel Canyon, Los Angeles, with their two children, Seth and Liliana.
4905
1300096926
https://en.wikipedia.org/wiki?curid=4905
House of Bonaparte
The House of Bonaparte (originally "Buonaparte") is a former imperial and royal European dynasty of French and Italian origin. It was founded in 1804 by Napoleon I, the son of the Corsican nobleman Carlo Buonaparte and Letizia Buonaparte (née Ramolino). Napoleon was a French military leader who rose to power during the French Revolution and who, in 1804, transformed the French First Republic into the First French Empire, five years after his "coup d'état" of November 1799 (18 Brumaire). Napoleon and the "Grande Armée" fought against every major European power except his allies (which included Denmark-Norway), and he dominated continental Europe through a series of military victories during the Napoleonic Wars. He installed members of his family on the thrones of client states, expanding the power of the dynasty. The House of Bonaparte formed the Imperial House of France during the French Empire, together with some non-Bonaparte family members. In addition to holding the title of Emperor of the French, the Bonaparte dynasty held various other titles and territories during the Napoleonic Wars, including the Kingdom of Italy, the Kingdom of Spain and the Indies, the Kingdom of Westphalia, the Kingdom of Holland, and the Kingdom of Naples. The dynasty held power for around a decade, until the Napoleonic Wars began to take their toll. Having made very powerful enemies, such as Austria, Britain, Russia, and Prussia, and having provoked royalist (particularly Bourbon) restorationist movements in France, Spain, the Two Sicilies, and Sardinia, the dynasty eventually collapsed with the final defeat of Napoleon I at the Battle of Waterloo and the restoration of the Bourbon dynasty by the Congress of Vienna. During the reign of Napoleon I, the Imperial Family consisted of the Emperor's immediate relations – his wife, son, siblings, and some other close relatives, namely his brother-in-law Joachim Murat, his uncle Joseph Fesch, and his stepson Eugène de Beauharnais. Between 1852 and 1870, during the Second French Empire, a member of the Bonaparte dynasty again ruled France: Napoleon III, the youngest son of Louis Bonaparte. During the Franco-Prussian War of 1870–1871, however, the dynasty was again ousted from the imperial throne. Since that time, there has been a series of pretenders. Supporters of the Bonaparte family's claim to the throne of France are known as Bonapartists. The current head of the house, Jean-Christophe, Prince Napoléon, has a Bourbon mother. Italian origins. The Bonaparte (originally "Buonaparte") family were patricians in the Italian towns of Sarzana, San Miniato, and Florence. The name derives from the Italian "buona" ("good") and "parte" ("part" or "side"); in Italian, the phrase "buona parte" denotes a fraction of considerable, but undefined, size. Gianfaldo Buonaparte, attested at Sarzana around 1200, is the first known member of the family. His descendant Giovanni Buonaparte married Isabella Calandrini, a cousin of the later cardinal Filippo Calandrini, in 1397. Giovanni became mayor of Sarzana and was named commissioner of the Lunigiana by Giovanni Maria Visconti in 1408. His daughter, Agnella Berni, was the great-grandmother of the Italian poet Francesco Berni, and his great-grandson Francesco Buonaparte was a mercenary horseman in the service of the Genoese Bank of Saint George. In 1490, Francesco Buonaparte went to the island of Corsica, which was controlled by the bank, and in 1493 he married the daughter of Guido da Castelletto, the representative of the Bank of Saint George in Ajaccio, Corsica.
Most of their descendants during subsequent generations were members of the Ajaccio town council. Napoleon's father, Carlo Buonaparte, received a patent of nobility from the King of France in 1771. A Buonaparte family also existed in Florence; however, its relation, if any, to the Sarzana and San Miniato families is unknown. Jacopo Buonaparte of San Miniato was a friend and advisor to the Medici Pope Clement VII. Jacopo was also a witness to, and wrote an account of, the sack of Rome, which is one of the most important historical documents recounting that event. Two of Jacopo's nephews, Pier Antonio Buonaparte and Giovanni Buonaparte, took part in the 1527 Medici rebellion, after which they were banished from Florence; they were later restored by Alessandro de' Medici, Duke of Florence. Jacopo's brother Benedetto Buonaparte maintained political neutrality. The San Miniato branch died out with Jacopo in 1550. The last member of the Florence family was a canon named Gregorio Bonaparte, who died in 1803, leaving Napoleon as his heir. A Buonaparte tomb lies in the Church of San Francesco in San Miniato. A second tomb, the "Chapelle Impériale", was built by Napoleon III in Ajaccio in 1857. Imperial House of France. In 1793, Corsica formally seceded from France and sought protection from the British government, prompting Pasquale Paoli to compel the Bonapartes to relocate to the mainland. Napoleon I is the most prominent name associated with the Bonaparte family, because he conquered much of Europe during the early 19th century. Due to his indisputable popularity in France, both among the people and in the army, he staged the Coup of 18 Brumaire and overthrew the Directory with the help of his brother Lucien Bonaparte, president of the Council of Five Hundred. Napoleon then oversaw the creation of a new constitution that made him First Consul of France on 10 November 1799. On 2 December 1804, he crowned himself Emperor of the French and ruled from 1804 to 1814, and again in 1815 during the Hundred Days after his return from Elba. Following his conquest of most of Western Europe, Napoleon I made his elder brother Joseph first King of Naples and then of Spain, his younger brother Louis King of Holland (subsequently forcing his abdication in 1810 after Louis's failure to subordinate Dutch interests to those of France), and his youngest brother Jérôme King of Westphalia, a short-lived realm created from several states of northwestern Germany. Napoleon's son Napoléon François Charles Joseph was made King of Rome and was later styled Napoleon II by loyalists of the dynasty, though he ruled for only two weeks after his father's abdication. Louis-Napoléon, son of Louis, was President of France and then Emperor of the French from 1852 to 1870, reigning as Napoleon III. His son, Napoléon, Prince Imperial, died fighting the Zulus in Natal, today the South African province of KwaZulu-Natal. With his death, the family lost much of its remaining political appeal, though claimants continue to assert their right to the imperial title. A political movement for Corsican independence that surfaced in the 1990s included a Bonapartist restoration in its programme. Family tree. Family tree list. Carlo-Maria (Ajaccio, 1746 – Montpellier, 1785) married Maria Letizia Ramolino (Ajaccio, 1750 – Rome, 1836) in 1764. He was a minor official in the local courts. They had eight children. Bonaparte coat of arms.
The arms of the Bonaparte family were: "Gules two bends sinister between two mullets or". In 1804, Napoleon I changed the arms to "Azure an imperial eagle or". The change applied to all members of his family except his brother Lucien and his nephew, the son of Jérôme's first marriage. DNA research. According to studies by G. Lucotte and his coauthors based on DNA research since 2011, Napoleon Bonaparte belonged to the Y-DNA (direct male ancestry) haplogroup E1b1b1c1* (E-M34*). This 15,000-year-old haplogroup has its highest concentrations in Ethiopia and in the Near East (Jordan, Yemen). According to the authors of the study, "Probably Napoléon also knew his remote oriental patrilineal origins, because Francesco Buonaparte (the Giovanni son), who was a mercenary under the orders of the Genoa Republic in Ajaccio in 1490, was nicknamed "The Maure of Sarzane"." The latest study identifies the common Bonaparte DNA markers from Carlo (Charles) Bonaparte down to three living descendants. In October 2013, Lucotte et al. published the extended Y-STR profile of Napoleon I based on descendant testing; the descendants were E-M34, just like the emperor's beard hair tested a year before. The persons tested were patrilineal descendants of Jérôme Bonaparte, one of Napoleon's brothers, and of Alexandre Colonna-Walewski, Napoleon's illegitimate son with Marie Walewska. These three tests all yielded the same Y-STR haplotype (109 markers), confirming with 100% certainty that the first Emperor of the French belonged to the M34 branch of haplogroup E1b1b. STR analysis strongly suggests that the Bonapartes belong to the Y58897 branch, which would mean that an ancestor living some 3,000 or more years ago came from Anatolia; however, all relatives in the database whose common ancestor lies within roughly the last 1,000 years are found in the small Massa–La Spezia area of Italy. At the moment there are no more distant relatives in the database, which means the lineage is very rare in Europe. Living members. Charles, Prince Napoléon (born 1950, a great-great-grandson of Jérôme Bonaparte by his second marriage), and his son Jean-Christophe, Prince Napoléon (born 1986 and appointed heir in the will of his grandfather Louis, Prince Napoléon), currently dispute the headship of the Bonaparte family. The only other male members of the family are Charles's brother, Prince Jérôme Napoléon (born 1957), who married in 2013, and Jean-Christophe's son, Prince Louis Napoléon (born 2022). There are no other legitimate descendants in the male line from Napoleon I or from his brothers. There are, however, numerous descendants of Napoleon's illegitimate, unacknowledged son, Count Alexandre Colonna-Walewski (1810–1868), born of Napoleon I's union with Marie, Countess Walewski. The actor René Auberjonois was a descendant of Napoleon's sister Caroline Bonaparte. Recent DNA matches with living descendants of Jérôme and of Count Walewski have confirmed the existence of descendants of Lucien Bonaparte, Napoleon's brother, namely the Clovis family.