# Hoffman-London graphs: When paths minimize $H$-colorings among trees
**Authors**: David Galvin, Phillip Marmorino, Emily McMillon, JD Nir, Amanda Redlich
David Galvin ∗ University of Notre Dame dgalvin1@nd.edu
JD Nir Oakland University jdnir@oakland.edu
Phillip Marmorino † Purdue University pmarmori@purdue.edu
Emily McMillon ‡ Rice University em72@rice.edu
Amanda Redlich University of Massachusetts Lowell amanda_redlich@uml.edu
January 1, 2026
## Abstract
Given a graph G and a target graph H , an H -coloring of G is an adjacency-preserving vertex map from G to H . The number of H -colorings of G , hom( G,H ), has been studied for many classes of G and H . In particular, extremal questions of maximizing and minimizing hom( G,H ) have been considered when H is a clique or G is a tree.
In this paper, we develop a new technique using automorphisms of H to show that hom( T, H ) is minimized by paths as T varies over trees on a fixed number of vertices. We introduce the term Hoffman-London to refer to graphs that are minimal in this sense. In particular, we define an automorphic similarity matrix which is used to compute hom( T, H ) and give matrix conditions under which H is Hoffman-London.
We then apply this technique to identify several families of graphs that are Hoffman-London, including loop threshold graphs and some with applications in statistical physics (e.g. the Widom-Rowlinson model). By combining our approach with a few other observations, we fully characterize the minimizing trees for all graphs H on three or fewer vertices.
## 1 Introduction
Graph coloring is a well-known problem in which vertices of a graph are assigned a color with the restriction that adjacent vertices receive different colors. In this paper, we consider the related notion of H -colorings of a graph G in which vertices of G are labeled by vertices of H in such a way that adjacent vertices in G must be 'colored' by adjacent vertices of H . Formally, given a simple, loopless graph G = ( V ( G ) , E ( G )) and a target graph H = ( V ( H ) , E ( H )) without multi-edges, but possibly containing loops, an H -coloring of G is a map f : V ( G ) → V ( H ) that preserves adjacency: whenever u ∼ G v we must have f ( u ) ∼ H f ( v ). Note that when H = K q , the complete graph on q vertices, an H -coloring of G is an assignment of the vertices of G to one of q values with the only
∗ Galvin is in part supported by Simons Foundation Gift no. 854277 - Travel Support for Mathematicians.
† Marmorino is in part supported by NSF AGS-10002554.
‡ McMillon is in part supported by NSF DMS-2303380.
restriction that adjacent vertices receive different labels; in other words, the proper q-colorings of G. In this sense, H-colorings are a generalization of proper vertex colorings, though, as we will see, H-colorings generalize several other common graph-theoretic notions as well.
While we mainly focus on the framework of H -colorings, it should be noted that an H -coloring of G is just a graph homomorphism from G to H . Fittingly, we denote the set of all H -colorings of G by Hom( G,H ) and use hom( G,H ) to count the number of such colorings.
As mentioned, H-colorings encompass proper q-colorings by allowing H to be a complete graph. By setting H = H_ind, which is one looped vertex connected by an edge to an unlooped vertex, we see H-colorings of G also encode the independent (or stable) sets of G: any collection of vertices mapped to the unlooped vertex contains no internal edges. Lovász [32] investigated connections between H-colorings and graph limits, quasi-randomness, and property testing. In statistical physics, the language of H-coloring has been adopted to describe hard-constraint spin models; see e.g. [1, 2].
This article contributes to a broader investigation into an extremal enumeration question: for a given family of graphs G and a target graph H , can we characterize those G ∈ G which maximize or minimize hom( G,H )? The origins of this question are attributed to Birkhoff's work on the 4-color conjecture [3, 4] with more recent attention coming from questions of Wilf [42] and Linial [30] concerning which graph on n vertices and m edges admits the most proper q -colorings, which is to say maximizes hom( G,K q ). For further results and conjectures, we direct the reader to the surveys [10, 45].
Here we consider the case where G is a family of trees; in particular we consider the family G = T_n, the set of all trees on n vertices. As in many such questions, the path graph P_n and star graph S_n = K_{1,n-1} are natural candidates for extremal graphs. When H = H_ind, Prodinger and Tichy [34] proved that for each n, the number of independent sets among n-vertex trees is maximized by stars and minimized by paths, confirming that for each T_n ∈ T_n,
$$\hom(P_n, H_{\mathrm{ind}}) \leq \hom(T_n, H_{\mathrm{ind}}) \leq \hom(S_n, H_{\mathrm{ind}}).$$
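Since H_ind-colorings of G are exactly the independent sets of G, the Prodinger-Tichy inequality can be checked directly by brute force for small n. A minimal sketch (the helper names `independent_set_count`, `path_edges`, and `star_edges` are ours, not from the paper):

```python
from itertools import combinations

def independent_set_count(n, edges):
    """Count independent sets of a graph on vertices 0..n-1 by brute force."""
    total = 0
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            chosen = set(subset)
            if not any(u in chosen and v in chosen for u, v in edges):
                total += 1
    return total

def path_edges(n):
    return [(i, i + 1) for i in range(n - 1)]

def star_edges(n):
    return [(0, i) for i in range(1, n)]

# Among trees on n vertices, paths minimize and stars maximize the count.
for n in range(2, 8):
    assert independent_set_count(n, path_edges(n)) <= independent_set_count(n, star_edges(n))
```

For instance, on four vertices the path admits 8 independent sets while the star admits 9.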
The Hoffman-London inequality (see [25, 31, 39]) is equivalent to the statement that hom( P n , H ) ≤ hom( S n , H ) for any choice of H and n . Sidorenko [40] extended the Hoffman-London inequality to the following very general result:
Theorem 1.1 (Sidorenko [40]) . Fix H and n ≥ 1 . For any T n ∈ T n ,
$$\hom(T_n, H) \leq \hom(S_n, H).$$
In other words, for any target graph H and order n , not only P n , but every tree T n admits at most as many H -colorings as S n . Sidorenko's result entirely resolves the maximization question for trees (and, it can be shown, connected graphs).
Given the result of Prodinger and Tichy, one might ask whether paths minimize the number of H-colorings; could one show a result analogous to Sidorenko's, that hom(P_n, H) ≤ hom(T_n, H) for all target graphs? Csikvári and Lin [6], following earlier work of Leontovich [29], dash such hopes by describing a target graph H and a tree E_7 on seven vertices for which hom(E_7, H) < hom(P_7, H). They raise the following natural problem [6, Problem 6.2]:
Problem 1.2. Characterize those H for which hom( P n , H ) ≤ hom( T n , H ) holds for all n and all T n ∈ T n .
Inspired by [25, 31], we introduce the following definitions:
Definition 1.3. We say a target graph H is Hoffman-London if for all n ≥ 1,
$$\hom(P_n, H) = \min_{T_n \in \mathcal{T}_n} \hom(T_n, H).$$
If it holds that hom( P n , H ) < hom( T n , H ) for all T n ∈ T n \ { P n } and all n ≥ 1, we say that H is strongly Hoffman-London .
We can thus restate Problem 1.2 as: Which target graphs H are Hoffman-London? To make progress on this problem, we develop criteria for identifying Hoffman-London and strongly Hoffman-London target graphs based on concepts introduced in a companion paper [19], which we restate here. Given a target graph H, possibly with loops, the orbit partition of H is the partition of V(H) obtained by declaring a pair of vertices u, v to be equivalent if there is an automorphism that maps u to v. We say such vertices are automorphically similar. Denote the equivalence classes of the orbit partition, which we call automorphic similarity classes, by H_1, ..., H_k, where k is the number of automorphism orbits of V(H). We will show in Lemma 3.1 that for each 1 ≤ i ≤ k and 1 ≤ j ≤ k, there is a constant m_{i,j} counting the number of neighbors any v ∈ H_i has in H_j, independent of the specific choice of v. The automorphic similarity matrix of H (relative to the specific ordering of the automorphic similarity classes) is the k × k matrix M = M(H) whose (i, j)-entry is m_{i,j}.
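As an illustration, for very small H the orbit partition and an automorphic similarity matrix can be computed by brute force over all vertex permutations. The following sketch (function names are ours) represents a loop at v as the edge {v}:

```python
from itertools import permutations

def automorphisms(n, edges):
    """All automorphisms of a graph on vertices 0..n-1.
    edges is a set of frozensets; a loop at v is frozenset({v})."""
    return [p for p in permutations(range(n))
            if all(frozenset(p[v] for v in e) in edges for e in edges)]

def orbit_partition(n, edges):
    """Automorphic similarity classes: u ~ v iff some automorphism maps u to v."""
    autos, classes, seen = automorphisms(n, edges), [], set()
    for u in range(n):
        if u not in seen:
            orbit = sorted({p[u] for p in autos})
            classes.append(orbit)
            seen.update(orbit)
    return classes

def similarity_matrix(n, edges, classes):
    """m[i][j] = number of neighbors a representative of class i has in class j;
    a loop contributes the vertex itself as a neighbor."""
    def nbrs(v):
        return {w for w in range(n) if frozenset({v, w}) in edges}
    return [[len(nbrs(c1[0]) & set(c2)) for c2 in classes] for c1 in classes]

# H_ind: vertex 0 looped, vertex 1 unlooped, joined by an edge.
H_IND = {frozenset({0}), frozenset({0, 1})}
classes = orbit_partition(2, H_IND)        # no automorphism swaps the two vertices
M = similarity_matrix(2, H_IND, classes)   # [[1, 1], [1, 0]]
```

For H_ind the two vertices lie in distinct classes, and with the ordering (looped, unlooped) the matrix is [[1, 1], [1, 0]].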
We also make use of the following matrix property, which is related to the idea of stochastic domination.
Definition 1.4. A k × k matrix M with ( i, j )-entry m i,j has the increasing columns property if any terminal segment of columns is non-decreasing: for each 1 ≤ c ≤ k and each 1 ≤ i ≤ k -1,
$$\sum _ { j = c } ^ { k } m _ { i , j } \leq \sum _ { j = c } ^ { k } m _ { i + 1 , j }$$
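Checking the increasing columns property for a given matrix is mechanical; a small sketch (ours):

```python
def has_increasing_columns(M):
    """Definition 1.4: for every terminal segment of columns, the segment's
    row sums are non-decreasing from each row to the next."""
    k = len(M)
    return all(sum(M[i][c:]) <= sum(M[i + 1][c:])
               for c in range(k) for i in range(k - 1))

# The property depends on how the similarity classes are ordered: for H_ind,
# the ordering (unlooped, looped) works but (looped, unlooped) does not.
assert has_increasing_columns([[0, 1], [1, 1]])
assert not has_increasing_columns([[1, 1], [1, 0]])
```

Note that the property can hold for one ordering of the classes and fail for another, which is why Theorem 1.5 below asks only for some suitable ordering.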
Our first contribution, captured by Theorem 1.5, is a test that can be applied to the automorphic similarity matrix of a graph H which, when it applies, proves that H is Hoffman-London.
Theorem 1.5. Let H be a target graph, possibly with loops. Suppose there is an ordering of the automorphic similarity classes H 1 , . . . , H k of H such that the automorphic similarity matrix M has the increasing columns property. Then H is Hoffman-London.
In certain cases, the reasoning behind Theorem 1.5 can be extended to show H is strongly Hoffman-London. For a graph G with designated vertex v , let Hom v → i ( G,H ) denote the set of H -colorings of G in which v is mapped to some (arbitrary but specific) representative of H i . We use hom v → i ( G,H ) for the number of such H -colorings. Although Hom v → i ( G,H ) clearly depends on the choice of representative, in Lemma 3.2 we will show that hom v → i ( G,H ) does not.
Corollary 1.6. Let H be a target graph that satisfies the hypotheses of Theorem 1.5. Suppose that for each t ≥ 2 , there exist two classes, H a ( t ) and H b ( t ) , such that there is at least one H -coloring of P t that sends one endvertex of P t to a vertex in H a ( t ) and the other endvertex to a vertex in H b ( t ) . Suppose further that for all s ≥ 2 , it holds that hom v → b ( t ) ( P s , H ) > hom v → a ( t ) ( P s , H ) , where v is one of the end vertices of P s . Then H is strongly Hoffman-London.
In Section 4 we apply Theorem 1.5 to prove (and in some cases, reprove known results) that several families of target graphs are Hoffman-London. These examples include vertex transitive
graphs to which looped dominating vertices are added (partially recovering a result of Engbers and the first author [15]), paths (partially recovering a result of Csikvári and Lin [7]) and fully looped paths, loop threshold graphs, blow-ups of fully looped stars, and families of graphs (including the Widom-Rowlinson model) originating from statistical physics. When possible, we also apply Corollary 1.6 to show that these graphs are strongly Hoffman-London.
## 2 Preliminary observations
In this section, we gather some observations about enumerating H -colorings of trees that will prove useful at various points throughout the paper. We also present a proof that complete bipartite graphs are Hoffman-London.
We begin with two observations pertaining to target graphs with more than one component. Both results follow immediately from the definition of H -coloring.
Observation 2.1. If H has components H 1 , . . . , H k , then for any connected graph G we have
$$h o m ( G , H ) = \sum _ { i = 1 } ^ { k } h o m ( G , H _ { i } )$$
In particular, if it holds that hom( P n , H i ) ≤ hom( T n , H i ) for all i , n , and T n ∈ T n , then it holds that hom( P n , H ) ≤ hom( T n , H ) for all n and T n ∈ T n .
That is, the set of graphs that are Hoffman-London is closed under taking disjoint unions. The second observation follows from the first, but is useful on its own when considering graphs with isolated vertices.
Observation 2.2. If H has isolated vertices and H ′ is obtained from H by removing the isolated vertices, then for each G without isolated vertices we have hom( G,H ) = hom( G,H ′ ). In particular, this means that for the purpose of minimizing H -colorings of trees on two or more vertices, we can ignore isolated vertices.
We now make an observation about regular target graphs. Note that here and throughout this paper, the degree of a vertex v is the number of edges (including loops) that contain v, so loops contribute 1 to the degree, not 2 as is common elsewhere in the literature.
Observation 2.3. If H is regular, then all trees on n vertices admit the same number of H -colorings. Specifically, hom( T, H ) = | V ( H ) | d n -1 , where d is the degree of each vertex in H .
It follows that regular graphs are, trivially, Hoffman-London. To see the validity of Observation 2.3, put an ordering v 1 , . . . , v n on the vertices of T satisfying that for each i ≥ 2, v i is adjacent to exactly one v j , j < i . For example, start with an arbitrary vertex as v 1 , then list all the vertices at distance 1 from v 1 , in some arbitrary order, then move on to all the vertices at distance 2 from v 1 , and so on. When H -coloring the vertices of T in this order, there are | V ( H ) | options for the color at v 1 , then d options for the color of each successive vertex.
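The counting argument above can be sanity-checked by brute force: for a regular H, every n-vertex tree yields |V(H)| d^(n-1) colorings. A minimal sketch (helper names ours), using the 4-cycle as a 2-regular target:

```python
from itertools import product

def hom_count(gv, gedges, hv, hedges):
    """Brute-force count of H-colorings of G; G has vertices 0..gv-1 and H has
    vertices 0..hv-1. hedges is a set of frozensets, so loops in H are allowed."""
    total = 0
    for f in product(range(hv), repeat=gv):
        if all(frozenset({f[u], f[v]}) in hedges for u, v in gedges):
            total += 1
    return total

c4 = {frozenset({i, (i + 1) % 4}) for i in range(4)}  # the 4-cycle, 2-regular
path4 = [(0, 1), (1, 2), (2, 3)]
star4 = [(0, 1), (0, 2), (0, 3)]
assert hom_count(4, path4, 4, c4) == 4 * 2 ** 3       # |V(H)| d^(n-1) = 32
assert hom_count(4, star4, 4, c4) == 4 * 2 ** 3
```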
Next, we refer to the notion of the tensor product of two graphs. For graphs H_1 and H_2, the graph H_1 × H_2 has vertex set V(H_1) × V(H_2), and pairs (x_1, y_1), (x_2, y_2) are adjacent in H_1 × H_2 if and only if x_1 x_2 ∈ E(H_1) and y_1 y_2 ∈ E(H_2); see Figure 1.

Figure 1: The tensor product K_2 × K_2.

There is a strong connection between tensor products and H-colorings, namely that for any graphs G, H_1, and H_2 we have

$$\hom(G, H_1 \times H_2) = \hom(G, H_1) \hom(G, H_2)$$

(see e.g. [32, Equation (5.30)]). From this fact we draw the following two conclusions:

Observation 2.4. If H_1, H_2 satisfy hom(P_n, H_i) ≤ hom(T_n, H_i) for each i ∈ {1, 2}, n ≥ 1, and T_n ∈ T_n, then

$$\hom(P_n, H_1 \times H_2) \leq \hom(T_n, H_1 \times H_2)$$

for all n and T_n ∈ T_n.

That is, the set of graphs that are Hoffman-London is closed under taking tensor products.

Observation 2.5. Let R be a regular graph of degree d with at least one edge. Then for any H and any n ≥ 1, the trees T_n in T_n that minimize hom(T_n, H) are the same as the trees T_n in T_n that minimize hom(T_n, H × R).

Observation 2.5 follows from Observation 2.4 and from the fact that hom(T_n, R) = |V(R)| d^{n-1} (see Observation 2.3) and is thus positive and independent of the specific choice of T_n ∈ T_n.

Finally, we present a result whose justification is more involved. A bipartition V(G) = X ∪ Y of a graph is called balanced if the sizes are as equal as possible, i.e. |X| = |Y| if |V(G)| is even and ||X| − |Y|| = 1 if |V(G)| is odd.

Theorem 2.6. Each complete bipartite graph H = K_{a,b} is Hoffman-London. When a ≠ b, T_n ∈ T_n minimizes hom(T_n, H) if and only if T_n has a balanced bipartition.

Proof. If a = b, then H is a regular graph and Observation 2.3 tells us that every tree admits the same number of H-colorings. For the remainder of the proof we assume that a ≠ b.

Any tree T_n on n vertices has a unique bipartition, say V(T_n) = X ∪ Y, with |X| = k and |Y| = n − k. Let A be the part of H containing a vertices and B be the part containing b vertices. Let φ ∈ Hom(T_n, K_{a,b}) and x ∈ X. Then, as T_n is connected, each vertex in X is adjacent to some vertex in Y and each vertex in Y is adjacent to some vertex in X, so if φ(x) ∈ A, we have φ(X) ⊆ A and φ(Y) ⊆ B, and if φ(x) ∈ B, φ(X) ⊆ B and φ(Y) ⊆ A. Furthermore, as H is complete bipartite, the image of x (and each other vertex in X) can be any vertex in the appropriate part, and similarly for the vertices of Y, so

$$\hom(T_n, H) = f(k) := a^k b^{n-k} + a^{n-k} b^k.$$

We have

$$f'(k) = (\log(a) - \log(b)) \bigl( a^k b^{n-k} - a^{n-k} b^k \bigr).$$

As a ≠ b, the only zero of f' occurs when a^k b^{n-k} = a^{n-k} b^k, that is, when a^{n-2k} = b^{n-2k}, which, for distinct positive integers a and b, requires n − 2k = 0, or k = n/2. Noting

$$f''(k) = (\log(a) - \log(b))^2 \bigl( a^k b^{n-k} + a^{n-k} b^k \bigr) > 0,$$

we see k = n/2 is a minimum of f, and furthermore f is decreasing as k increases from 0 to n/2. Therefore

$$\min_{T_n \in \mathcal{T}_n} \hom(T_n, H) \geq 2 a^{n/2} b^{n/2}$$

when n is even. When n is odd, since k must be an integer,

$$\min_{T_n \in \mathcal{T}_n} \hom(T_n, H) \geq a^{(n-1)/2} b^{(n+1)/2} + a^{(n+1)/2} b^{(n-1)/2}.$$

In both cases, P_n achieves the bound, as does any tree whose bipartition is balanced. If T_n is a tree whose bipartition is not balanced, f(k) is strictly greater than the minimum value.
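The closed form hom(T_n, K_{a,b}) = a^k b^{n-k} + a^{n-k} b^k can be checked by brute force on small trees; in this sketch (helper names ours) the balanced bipartition of P_5 beats the unbalanced one of S_5:

```python
from itertools import product

def hom_count(gv, gedges, hv, hedges):
    """Brute-force count of H-colorings of G (vertices 0..gv-1 into 0..hv-1)."""
    total = 0
    for f in product(range(hv), repeat=gv):
        if all(frozenset({f[u], f[v]}) in hedges for u, v in gedges):
            total += 1
    return total

def k_ab_edges(a, b):
    """Complete bipartite K_{a,b} on parts {0..a-1} and {a..a+b-1}."""
    return {frozenset({i, a + j}) for i in range(a) for j in range(b)}

a, b, n = 2, 3, 5

def f(k):
    return a ** k * b ** (n - k) + a ** (n - k) * b ** k

path5 = [(i, i + 1) for i in range(4)]  # bipartition sizes 3 and 2 (balanced)
star5 = [(0, i) for i in range(1, 5)]   # bipartition sizes 1 and 4
assert hom_count(5, path5, a + b, k_ab_edges(a, b)) == f(2) == 180
assert hom_count(5, star5, a + b, k_ab_edges(a, b)) == f(1) == 210
assert f(2) < f(1)                      # the balanced bipartition gives fewer colorings
```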
Csikvári and Lin [7] observed the case a = 1 of Theorem 2.6. The question of whether complete multipartite graphs with more than two parts are Hoffman-London remains open.
## 3 Proofs of Theorem 1.5 and Corollary 1.6
In this section, we prove Theorem 1.5, our primary condition for identifying Hoffman-London graphs H, and Corollary 1.6, our condition for identifying strongly Hoffman-London graphs. Recall from the introduction that if H is a target graph, possibly with loops, and u and v are vertices in H, we say u and v are automorphically similar if there is an automorphism of H that sends u to v. Automorphic similarity defines an equivalence relation on V(H), and we call the equivalence classes of that relation the automorphic similarity classes of H. The resulting partition of the vertices is sometimes called the orbit partition. As discussed in [23, Section 9.3], the orbit partition is an equitable partition, which means that for any two (not necessarily distinct) automorphic similarity classes A and B, there is a constant c(A, B) such that |N(v) ∩ B| = c(A, B) for each v ∈ A. In other words, vertices in the same automorphic similarity class have the same number of neighbors in each other class. For completeness, and because the proof is brief and illustrative, we include the following lemma (which was first presented in [19]).
Lemma 3.1. Let H be a graph, possibly with loops, and let H 1 , . . . , H k be the automorphic similarity classes of H . For any 1 ≤ i ≤ k , any u, v ∈ H i , and any 1 ≤ j ≤ k ,
$$|N(u) \cap H_j| = |N(v) \cap H_j|.$$
Proof. Let φ be an automorphism of H that sends u to v. Then each w ∈ N(u) ∩ H_j is mapped to some w′ = φ(w). As w ∼ u, we see w′ ∼ v, and as w ∈ H_j and φ(w) = w′, we see w and w′ are automorphically similar, so w′ ∈ H_j as well. We have shown |N(u) ∩ H_j| ≤ |N(v) ∩ H_j|. Repeating the analysis with φ^{-1}, we see |N(u) ∩ H_j| ≥ |N(v) ∩ H_j|, and so equality holds.
Using Lemma 3.1, given an ordering H 1 , . . . , H k of the automorphic similarity classes of H , we define m i,j = c ( H i , H j ) to be the number of neighbors that an arbitrary vertex in H i has in H j . We then define an automorphic similarity matrix of H to be a k × k matrix M = M ( H ) whose ( i, j )-entry is m i,j , where we note that different orderings of the automorphic similarity classes of H may result in different automorphic similarity matrices.
As we will see in Lemma 3.3, the matrix M , together with the sizes of the automorphic similarity classes of H , contains all of the information necessary to count H -colorings of trees. This could also be accomplished using the adjacency matrix of H , but for highly symmetric target graphs the automorphic similarity matrix is much smaller than the adjacency matrix and is thus easier to analyze. First, though, we need the following result, also presented in [19].
Lemma 3.2. Let H be a graph, possibly with loops, and let G be an arbitrary graph with distinguished vertex v . Suppose w and w ′ are automorphically similar vertices in H . Let G be the set of H -colorings of G in which v is sent to w , and let G ′ be the set of H -colorings of G in which v is sent to w ′ . Then |G| = |G ′ | .
Proof. Let φ be an automorphism of H that sends w to w′, which exists as w and w′ are automorphically similar. Then the function mapping f to φ ◦ f is a bijection from G to G′.
We now present another lemma which we will require in the proof of Theorem 1.5 but which we feel is also of independent interest: a generalization, to the setting of automorphic similarity classes and matrices, of the tree-walk algorithm of Csikvári and Lin [6]. Recall that Hom_{v→i}(T, H) denotes the set of H-colorings of T in which v is mapped to some (arbitrary but specific) representative of H_i and that we set hom_{v→i}(T, H) = |Hom_{v→i}(T, H)|.
Lemma 3.3. Let H be a target graph with automorphic similarity classes H 1 , . . . , H k and associated automorphic similarity matrix M . Let T be a tree with root v . Let a ( H ) be the row vector whose i th entry is | H i | and define h ( T, v ) to be the column vector whose i th entry is hom v → i ( T, H ) .
Then hom( T, H ) = a ( H ) h ( T, v ) , and h ( T, v ) can be explicitly computed from T , a ( H ) , and M , via recursion on T .
Proof. To see that hom(T, H) = a(H) h(T, v), partition Hom(T, H) according to the color given to v. There are |H_i| classes in this partition in which v is sent to some vertex in H_i, and by Lemma 3.2, each of these classes has size hom_{v→i}(T, H). Hence the total count of H-colorings is
$$\sum_{i=1}^{k} |H_i| \hom_{v \to i}(T, H).$$
To compute h ( T, v ), we proceed recursively. When T consists of a single vertex, necessarily v , we have that hom v → i ( T, H ) = 1 for any i , and so h ( T, v ) is the constant vector with all entries equal to 1.
When T has more than one vertex, we consider separately the cases deg( v ) = 1 and deg( v ) ≥ 2. When deg( v ) = 1, let w be the neighbor of v . Then
$$\hom_{v \to i}(T, H) = \sum_{j=1}^{k} m_{i,j} \hom_{w \to j}(T - v, H).$$
Indeed, once we have colored v with the representative vertex from H i , there are, for each j , m i,j choices for a color from H j for w (as the representative vertex in H i has m i,j neighbors in H j for each j ). For each of these m i,j choices, recalling that hom w → j ( T -v, H ) is independent of the choice of representative from H j due to Lemma 3.2, there are hom w → j ( T -v, H ) ways to extend the coloring to the rest of T . More succinctly, we have
$$h ( T , v ) = M h ( T - v , w )$$
where M is the automorphic similarity matrix of H .
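For a path, iterating this deg(v) = 1 step from an endvertex gives h(P_n, v) = M^{n-1} 1, where 1 is the all-ones vector. As a concrete check (our code, not the paper's), taking H = H_ind with classes ordered (looped, unlooped) gives a(H) = [1, 1] and M = [[1, 1], [1, 0]], and the resulting values hom(P_n, H_ind) are the independent-set counts 2, 3, 5, 8, 13, ...:

```python
def hom_path(n, a_vec, M):
    """hom(P_n, H) via repeated application of h(T, v) = M h(T - v, w),
    peeling the path one leaf at a time from a fixed endvertex."""
    k = len(a_vec)
    h = [1] * k                                   # single-vertex tree: all entries 1
    for _ in range(n - 1):
        h = [sum(M[i][j] * h[j] for j in range(k)) for i in range(k)]
    return sum(ai * hi for ai, hi in zip(a_vec, h))

# H_ind with classes ordered (looped, unlooped): a(H) = [1, 1], M as below.
M = [[1, 1], [1, 0]]
# hom(P_n, H_ind) = number of independent sets of P_n: 2, 3, 5, 8, 13, ...
assert [hom_path(n, [1, 1], M) for n in range(1, 6)] == [2, 3, 5, 8, 13]
```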
When deg( v ) = d ≥ 2, let w be any neighbor of v . Form tree T ′ by taking the component of T -v that contains w and adding a new vertex v ′ ∼ w . Form tree T ′′ by taking each component of T -v that does not contain w (of which there is at least one, because v has degree at least two) and adding a new vertex v ′′ adjacent to each vertex that was adjacent to v in T . Note that by gluing v ′ to v ′′ we recover T . (See Figure 2.)
Figure 2: Deconstructing T into T ′ and T ′′ .
$$h(T, v) = h(T', v') \odot h(T'', v''),$$
where ⊙ represents the Hadamard product in which entries are multiplied term-wise. Indeed, to H -color T while sending v to H i , we take any H -coloring of T ′ where v ′ is sent to H i (hom v ′ → i ( T ′ , H ) many options) and independently any H -coloring of T ′′ where v ′′ is sent to H i (hom v ′′ → i ( T ′′ , H ) many options).
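The two recursive steps (stripping a leaf, and splitting at a vertex of degree at least two) combine into a tree-walk computing h(T, v) in full. A compact sketch (ours; it folds the T′/T″ split into a product over the branches at v, iterating the Hadamard step above):

```python
def h_vec(tree, v, M, parent=None):
    """h(T, v): entry i is hom_{v->i}(T, H). Branches at v are combined by the
    entrywise (Hadamard) product, folding the T'/T'' decomposition."""
    k = len(M)
    h = [1] * k                                   # base case: single vertex
    for w in tree[v]:
        if w == parent:
            continue
        hw = h_vec(tree, w, M, v)                 # vector for the branch rooted at w
        mh = [sum(M[i][j] * hw[j] for j in range(k)) for i in range(k)]
        h = [h[i] * mh[i] for i in range(k)]
    return h

def hom_tree(tree, root, a_vec, M):
    return sum(ai * hi for ai, hi in zip(a_vec, h_vec(tree, root, M)))

# H_ind with classes ordered (looped, unlooped); trees as adjacency dicts.
M = [[1, 1], [1, 0]]
star4 = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
assert hom_tree(star4, 0, [1, 1], M) == 9   # independent sets of the star: 2^3 + 1
assert hom_tree(path4, 0, [1, 1], M) == 8   # independent sets of P_4
```

Rooting at a different vertex gives the same total, as it must.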
We require an additional ingredient before proving Theorem 1.5. The KC ordering, introduced by Csikvári [5] (under the name generalized tree shift) as a generalization of an operation introduced by Kelmans [28], is a partial ordering of trees on n vertices. Let T be a tree, and let v_ℓ ≠ v_r be two non-leaf vertices with the property that the unique v_ℓ to v_r path in T is a bare path, meaning that each of the internal vertices, if they exist, has degree two. Let P denote that path on t ≥ 2 vertices.
Removing the edges and internal vertices (if any) of P from T leaves two components; call them L (the one containing v_ℓ) and R (the one containing v_r). We think of these as the 'left' and 'right' components. Denote by T^KC the tree built as follows: glue L and R together into a single tree by identifying the vertices v_ℓ and v_r; let v be the vertex created by the identification. Then complete the construction of T^KC by appending a path with t − 1 new vertices to v; denote by w the other end of the appended path. We say that T^KC is obtained from T by a KC move. Figure 3 illustrates this process. Note that T has at least one such pair of non-leaf vertices v_ℓ, v_r if and only if T is not a star.
Figure 3: A demonstration of a KC move.
<details>
<summary>Image 3 Details</summary>

### Visual Description
The figure shows, on the left, the tree T : components L and R (drawn as circles) joined by a bare path P whose endpoints are v ℓ ∈ L and v r ∈ R . On the right it shows T KC : the vertices v ℓ and v r have been identified into a single vertex v at which L and R now meet, and the path P has been reattached at v , with w its other endpoint.
</details>
Csikvári [5, Remark 2.4] shows that the relation S ≤ T defined by T being obtainable from S by a sequence of KC moves defines, for each n , a partial order on T n (the KC ordering ), with P n the unique minimal element and S n the unique maximal element.
The following lemma is necessary for the proof of Theorem 1.5. The proof that we give follows [27, Theorem 1.1, ( a ) ⇒ ( c )]. We choose to present the full details because [27] only treats matrices whose row sums are 1.
Lemma 3.4. Let M be a k × k non-negative matrix that has the increasing columns property, and let h be a non-negative column vector of dimension k whose entries are non-decreasing. Then the entries of M h are non-negative and non-decreasing.
Proof. Let S be the k × k matrix that has 1s on and below the main diagonal and 0s everywhere else. Note that S -1 has 1s down the main diagonal, -1s down the first subdiagonal, and 0s everywhere else, which can be verified by computing SS -1 .
Observe that for a vector v = [ v 1 v 2 · · · v k ] T , the entries of v being non-negative and non-decreasing is equivalent to the entries of S -1 v being non-negative because the first entry of S -1 v is v 1 and for i > 1, the i th entry is v i -v i -1 .
We wish to show that the entries of M h are non-negative and non-decreasing. This is implied by S -1 M h having non-negative entries, which is in turn implied by ( S -1 MS )( S -1 h ) having non-negative entries. By assumption, the entries of h are non-negative and non-decreasing, so the entries of S -1 h are non-negative. Therefore it suffices to show that S -1 MS has non-negative entries.
Let A = S -1 MS . We have
$$A_{i,j} = \sum_{p=1}^{k} \sum_{q=1}^{k} (S^{-1})_{i,p} \, m_{p,q} \, S_{q,j} = \sum_{q=j}^{k} \left( m_{i,q} - m_{i-1,q} \right),$$
where we adopt the convention that m 0 ,q = 0. Then
$$A_{i,j} = \sum_{q=j}^{k} m_{i,q} - \sum_{q=j}^{k} m_{i-1,q} \geq 0$$
follows from the increasing columns property.
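Lemma 3.4 is easy to test numerically. In the sketch below (plain Python, no libraries), we read the increasing columns property as requiring the tail sums Σ_{q ≥ j} m_{i,q} to be non-decreasing in i for every j; this reading is our assumption, chosen to match how the property is applied in the computation above.

```python
def has_increasing_columns(M):
    """Increasing columns property, read as: for every j, the tail sums
    sum(M[i][j:]) are non-decreasing in i (an assumption consistent with
    how the property is used in the surrounding proofs)."""
    k = len(M)
    return all(
        sum(M[i][j:]) <= sum(M[i + 1][j:])
        for i in range(k - 1) for j in range(k)
    )

def mat_vec(M, h):
    return [sum(m * x for m, x in zip(row, h)) for row in M]

M = [[0, 1, 1],
     [1, 1, 1],
     [1, 1, 2]]          # tail sums are non-decreasing down the rows
h = [1, 1, 2]            # non-negative and non-decreasing

ok_property = has_increasing_columns(M)
Mh = mat_vec(M, h)       # by Lemma 3.4, again non-negative and non-decreasing
ok_monotone = all(0 <= a <= b for a, b in zip(Mh, Mh[1:]))
```

For this choice, M h = [3, 4, 6], which is indeed non-negative and non-decreasing.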
We are now prepared to prove Theorem 1.5, which we restate here for convenience.
Theorem 1.5. Let H be a target graph, possibly with loops. Suppose there is an ordering of the automorphic similarity classes H 1 , . . . , H k of H such that the automorphic similarity matrix M has the increasing columns property. Then H is Hoffman-London.
Proof. We show that when H is a graph whose automorphic similarity classes can be ordered in such a way that the automorphic similarity matrix of H has the increasing columns property, applying a non-trivial KC move to a tree does not decrease the number of H -colorings of the tree. As the path is the minimum element of the KC ordering, it must therefore be among the minimizers of hom( T n , H ).
We begin by establishing some notation. Let T be a non-star tree, let v ℓ ≠ v r be two non-leaf vertices of T with the property that all of the internal vertices on the unique v ℓ to v r path have degree two, let t be the number of vertices on that path (including v ℓ and v r ), and let T KC be the tree obtained from T by applying a KC move at vertices v ℓ and v r . Suppose H has automorphic similarity classes H 1 , . . . , H k , labeled in such a way that the automorphic similarity matrix M has the increasing columns property. For 1 ≤ i ≤ k , denote by L i the set of H -colorings of L in which v ℓ is mapped to some specific (but arbitrarily chosen) vertex w i of H i , and let ℓ i = |L i | . Define R i and r i analogously (with L replaced by R ). Note that by Lemma 3.2, ℓ i and r i depend only on i and not on the specific choice of w i . For w,v ∈ V ( H ), denote by P t ( w,v ) the set of H -colorings of any path x 1 , . . . , x t (where, recall, t is the number of vertices on the unique path in T between v ℓ and v r ) in which x 1 is mapped to w and x t is mapped to v , and let p t ( w,v ) = |P t ( w,v ) | . Finally, for 1 ≤ i ≤ k and 1 ≤ j ≤ k , let P t i,j = ⋃ w ∈ H i , v ∈ H j P t ( w,v ) and let p t i,j = |P t i,j | = ∑ w ∈ H i , v ∈ H j p t ( w,v ).
The idea at the core of this proof is that hom( T, H ) and hom( T KC , H ) can both be expressed in terms of the ℓ i , r j , and p t i,j s by partitioning the H -colorings according to, in the case of T , the automorphic similarity classes to which v ℓ and v r are mapped, and in the case of T KC , the automorphic similarity classes to which v and w are mapped. In each case we get expressions that are sums of k 2 terms, and we show hom( T KC , H ) ≥ hom( T, H ) by proving each term in the summation expression for hom( T KC , H ) -hom( T, H ) is non-negative.
When computing hom( T, H ), v ℓ and v r can take any pair of values, and those pairs of values then determine the values taken by the endvertices of P , which gives
$$\hom(T, H) = \sum_{i=1}^{k} \sum_{j=1}^{k} \ell_i \, r_j \, p^{t}_{i,j}. \tag{5}$$
Note here that the count of H -colorings of T that send v ℓ to some w ∈ H i and v r to some v ∈ H j should, a priori , involve terms that depend on w and v , but by Lemma 3.2, these terms ( ℓ i and r j , respectively) are independent of w and v .
In contrast, when computing hom( T KC , H ), v ℓ and v r are forced to take a common value, which determines the values at one of the endvertices of P , and the value at the other endvertex is free, which means
$$\hom(T^{KC}, H) = \sum_{i=1}^{k} \sum_{j=1}^{k} \ell_i \, r_i \, p^{t}_{i,j}. \tag{6}$$
The term ℓ i r i p t i,i appears in both sums for each 1 ≤ i ≤ k . Using symmetry to note p t i,j = p t j,i for 1 ≤ i < j ≤ k , we combine equations (5) and (6) to write
$$\hom(T^{KC}, H) - \hom(T, H) = \sum_{1 \leq i < j \leq k} \left( \ell_i r_i + \ell_j r_j - \ell_i r_j - \ell_j r_i \right) p^{t}_{i,j} = \sum_{1 \leq i < j \leq k} (\ell_j - \ell_i)(r_j - r_i) \, p^{t}_{i,j}. \tag{7}$$
Noting that trivially p t i,j ≥ 0, we conclude that in order to prove hom( T KC , H ) ≥ hom( T, H ) it suffices to show that for each i, j we have ( ℓ j -ℓ i )( r j -r i ) ≥ 0.
Given any tree T and distinguished vertex v , recall that hom v → i ( T, H ) denotes the number of H -colorings of T in which distinguished vertex v is mapped to some specific (but arbitrarily chosen) vertex w i of H i , where we again note that by Lemma 3.2, hom v → i ( T, H ) depends only on i and not on the specific choice of w i . As in the statement of Lemma 3.3, define h ( T, v ) to be the k -dimensional column vector whose i th entry is hom v → i ( T, H ). We now prove that our condition on the columns of M , the automorphic similarity matrix of H , ensures that h ( T, v ) is non-decreasing for any choice of T and v , so that in particular when i < j ,
$$\ell_j = \hom_{v_\ell \to j}(L, H) \geq \hom_{v_\ell \to i}(L, H) = \ell_i,$$
and analogously r j ≥ r i .
To see that h ( T, v ) is always non-decreasing, we use Lemmas 3.3 and 3.4. By the proof of Lemma 3.3, h ( T, v ) can be computed from the all-1s column vector by a sequence of steps that involve taking Hadamard products of vectors and pre-multiplying vectors by M . Combining the facts that the all-1s vector is non-negative and non-decreasing, that the Hadamard product of non-negative, non-decreasing vectors is both non-negative and non-decreasing, and that (by Lemma 3.4) pre-multiplying by M preserves the properties of being non-negative and non-decreasing, we see that h ( T, v ) is indeed non-negative and non-decreasing. Therefore, ℓ j ≥ ℓ i and r j ≥ r i , so for each i and j , ( ℓ j -ℓ i )( r j -r i ) ≥ 0 and we must have hom( T KC , H ) ≥ hom( T, H ), as desired.
We now restate and prove Corollary 1.6, which gives conditions under which paths uniquely minimize hom( T, H ). The proof focuses on equation (7) in the special case where T is a path, and aims to show that at least one of the summands on the right-hand side of equation (7) is (under certain circumstances) strictly positive. As an aside, we note that unlike traditional corollaries which follow directly from a theorem, we justify Corollary 1.6 using ideas from the proof of Theorem 1.5 rather than using its statement. Nonetheless, we use the term 'corollary' to highlight the connection between the results.
Corollary 1.6. Let H be a target graph that satisfies the hypotheses of Theorem 1.5. Suppose that for each t ≥ 2 , there exist two classes, H a ( t ) and H b ( t ) , such that there is at least one H -coloring of P t that sends one endvertex of P t to a vertex in H a ( t ) and the other endvertex to a vertex in H b ( t ) . Suppose further that for all s ≥ 2 , it holds that hom v → b ( t ) ( P s , H ) > hom v → a ( t ) ( P s , H ) , where v is one of the end vertices of P s . Then H is strongly Hoffman-London.
Proof. Let P KC n be obtained from P n by a KC move. Suppose that the bare path involved in this move has t vertices. We have
$$\hom(P_n^{KC}, H) - \hom(P_n, H) = \sum_{1 \leq i < j \leq k} (\ell_j - \ell_i)(r_j - r_i) \, p^{t}_{i,j} \tag{8}$$
$$\geq \left( \ell_{b(t)} - \ell_{a(t)} \right) \left( r_{b(t)} - r_{a(t)} \right) p^{t}_{a(t),b(t)}. \tag{9}$$
Equation (8) is an instance of equation (7), while inequality (9) uses that ( ℓ j -ℓ i )( r j -r i ) p t ij ≥ 0 for all i, j , which comes from the proof of Theorem 1.5.
Note that p t a ( t ) ,b ( t ) counts the number of H -colorings of P t that send one end vertex of P t to something in H a ( t ) and the other end to something in H b ( t ) ; by hypothesis there is at least one such H -coloring, so p t a ( t ) ,b ( t ) > 0.
We have that ℓ b ( t ) = hom v → b ( t ) ( L, H ) and ℓ a ( t ) = hom v → a ( t ) ( L, H ). As L is a path, we have by hypothesis that ℓ b ( t ) > ℓ a ( t ) . Similarly, r b ( t ) > r a ( t ) . We conclude that
$$\left( \ell_{b(t)} - \ell_{a(t)} \right) \left( r_{b(t)} - r_{a(t)} \right) p^{t}_{a(t),b(t)} > 0.$$
It follows from inequality (9) that hom( P KC n , H ) -hom( P n , H ) > 0, and so for any T n obtained from P n by a sequence of KC moves, hom( P n , H ) < hom( P KC n , H ) ≤ hom( T n , H ). Since all trees on n vertices can be thus obtained, the result follows.
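For a concrete instance of these minimization results, take H = H ind (an edge with a loop at one vertex), so that hom( T, H ) counts independent sets of T . The brute-force Python check below (with adjacency-dict encodings that are ours) confirms that among the three trees on five vertices the path is the strict minimizer.

```python
from itertools import product

def hom(G, H):
    """Brute-force count of H-colorings of G (both given as adjacency dicts)."""
    verts = sorted(G)
    return sum(
        all(f[y] in H[f[x]] for x in G for y in G[x])
        for assignment in product(sorted(H), repeat=len(verts))
        for f in [dict(zip(verts, assignment))]
    )

# H_ind: an edge with a loop at vertex 0; hom(G, H_ind) = number of
# independent sets of G.
H_ind = {0: {0, 1}, 1: {0}}

# The three trees on 5 vertices: path, spider (star with one subdivided
# edge), and star.
path   = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
spider = {0: {1}, 1: {0, 2}, 2: {1, 3, 4}, 3: {2}, 4: {2}}
star   = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}

counts = {name: hom(T, H_ind)
          for name, T in [("path", path), ("spider", spider), ("star", star)]}
```

The counts come out as 13, 14, and 17 respectively, so the path is the unique minimizer here, as Corollary 1.6 predicts for this strongly Hoffman-London target.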
## 4 Applications of Theorem 1.5
One goal of this paper is to use Theorem 1.5 to contribute towards the resolution of Problem 1.2. Before we state our results in this direction, we summarize some pertinent results from the literature. We then show how Theorem 1.5 can reprove and sometimes strengthen some of these results, before moving on to establishing some new Hoffman-London families.
## 4.1 Two previous results on Hoffman-London graphs
In inequality (1), we saw that the path minimizes the number of independent sets among trees, i.e. that the target graph H ind is Hoffman-London, where, recall, H ind consists of two vertices joined by an edge with a loop at one vertex. Engbers and Galvin [15] gave a significant generalization of this result by showing that target graphs H formed by adding looped dominating vertices (looped vertices that are adjacent to all other vertices) to a regular graph are Hoffman-London, and they characterized the strongly Hoffman-London target graphs in this family.
Theorem 4.1 (Engbers, Galvin [15]) . If H is obtained from a regular graph by adding any number of looped dominating vertices, then H is Hoffman-London. If H is not regular (equivalently, if H is not a fully looped complete graph), then H is strongly Hoffman-London.
A special case of Theorem 4.1, namely that H is strongly Hoffman-London when it is obtained from an empty (edgeless) graph by adding any non-zero number of looped dominating vertices (equivalently, when it is the join of an empty graph and a fully looped complete graph), can also be easily derived from earlier work of Wingard [43, Theorem 5.1 and Theorem 5.2] on independent sets of fixed size in trees.
Csikvári and Lin [7] showed that paths and stars are Hoffman-London.
Theorem 4.2 (Csikvári, Lin [7]) . If H is either a path or a star (on any number of vertices), then for all n and all T n ∈ T n we have
$$\hom(P_n, H) \leq \hom(T_n, H).$$
Note that Csikvári and Lin do not comment on when paths and stars are strongly Hoffman-London. We will address this question for paths in Section 4.3, specifically after the proof of Theorem 4.6. For stars, note that Theorem 2.6 shows that S k is not strongly Hoffman-London for any k .
## 4.2 Target graphs with two automorphic similarity classes
In this section we specialize Theorem 1.5 to the case in which H has exactly two automorphic similarity classes and is not a regular graph. (Note that by Observation 2.3, if H is regular, then hom( T n , H ) is independent of T n , so H is Hoffman-London but not strongly Hoffman-London.) Recall that automorphic similarity classes partition the vertex set, and so in this case all vertices of H are in either H 1 or H 2 . We have the following corollary of Theorem 1.5 and Corollary 1.6.
Corollary 4.3. Let H be a target graph that is not regular and that has two automorphic similarity classes, H 1 and H 2 , ordered so that the (common) degree of any vertex in H 1 is smaller than the (common) degree of any vertex in H 2 . If m 1 , 2 ≤ m 2 , 2 , then H is Hoffman-London. If, further, there is a vertex in H that has a neighbor in H 1 and a neighbor in H 2 , then H is strongly Hoffman-London.
Proof. To see that H is Hoffman-London, note that for any vertices x ∈ H 1 and y ∈ H 2 we have
$$\deg(x) = m_{1,1} + m_{1,2} < m_{2,1} + m_{2,2} = \deg(y). \tag{10}$$
Together with m 1 , 2 ≤ m 2 , 2 , inequality (10) shows that the automorphic similarity matrix of H satisfies the increasing columns property, and so H is Hoffman-London by Theorem 1.5.
We now use Corollary 1.6 to establish that if H has a vertex that is adjacent to vertices in both H 1 and H 2 , then H is strongly Hoffman-London. Observe that for 0 < a ≤ b we have
$$M \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} m_{1,1}\, a + m_{1,2}\, b \\ m_{2,1}\, a + m_{2,2}\, b \end{bmatrix} = \begin{bmatrix} a' \\ b' \end{bmatrix}, \tag{11}$$
and our hypotheses that m 1 , 1 + m 1 , 2 < m 2 , 1 + m 2 , 2 and 0 < a ≤ b imply that 0 < a ′ < b ′ . We use this fact to show that for all s ≥ 2, we have hom v → 1 ( P s , H ) < hom v → 2 ( P s , H ). As in the statement of Lemma 3.3, define h ( T, v ) to be the k -dimensional column vector whose i th entry is hom v → i ( T, H ). When s = 1, the i th entry of h ( P 1 , v ) is hom v → i ( P 1 , H ), the count of H -colorings of P 1 in which v is mapped to a representative of H i , so h ( P 1 , v ) = [ 1 1 ] T , where v is the unique
vertex of P 1 . Then for s ≥ 2, we have h ( P s , v ) = M h ( P s -1 , v ), so by equation (11), we conclude hom v → 1 ( P s , H ) < hom v → 2 ( P s , H ).
To apply Corollary 1.6 and conclude that H is strongly Hoffman-London, it remains to show that for all t ≥ 2 there is an H -coloring of P t that sends one end vertex to H 1 and the other to H 2 . Let x be a vertex of H that has a neighbor y ∈ H 1 and a neighbor z ∈ H 2 , noting that x , y , and z need not be distinct. Suppose x ∈ H 1 . If t is even, then we can map the vertices of P t alternately to x and z ; if t is odd, then we can map the first vertex of P t to y , the second to x , and then proceed as in the even case. The argument for x ∈ H 2 follows analogously.
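The iteration h( P s , v ) = M h( P s-1 , v ) used in this proof is easy to run on a concrete example. For H = H ind (an edge with a loop at one vertex), with H 1 the unlooped vertex and H 2 the looped one, the automorphic similarity matrix is [[0, 1], [1, 1]] (our computation for this small example); the sketch below checks the strict inequality hom v → 1 ( P s , H ) < hom v → 2 ( P s , H ) for small s .

```python
def mat_vec(M, h):
    return [sum(m * x for m, x in zip(row, h)) for row in M]

# Automorphic similarity matrix of H_ind (edge with a loop at one end):
# class H_1 = the unlooped vertex, class H_2 = the looped vertex.
M = [[0, 1],
     [1, 1]]

h = [1, 1]                       # h(P_1, v)
strict = []
for s in range(2, 8):
    h = mat_vec(M, h)            # h(P_s, v) = M h(P_{s-1}, v)
    strict.append(h[0] < h[1])   # hom_{v->1}(P_s, H) < hom_{v->2}(P_s, H)
all_strict = all(strict)
```

The iterates are consecutive pairs of Fibonacci numbers (ending at [13, 21] for s = 7), so the inequality is strict at every step, as the corollary's hypothesis requires.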
We can use Corollary 4.3 to recover a portion of Theorem 4.1. Specifically, Corollary 4.3 yields Theorem 4.1 in the case when looped dominating vertices are added to a vertex-transitive (and so necessarily regular) graph. Although the result below is weaker than Theorem 4.1, we include it as it makes our classification of target graphs on three vertices (see Section 5) self-contained.
Proof of special case of Theorem 4.1. Let H be obtained from a vertex-transitive graph, say H 0 , by adding some number of looped dominating vertices. If either the number of looped dominating vertices added is zero or H 0 is a fully looped complete graph, then H is regular, and so by Observation 2.3, H is Hoffman-London but not strongly Hoffman-London. For the remainder of the proof, we assume that H 0 is regular but not a fully looped complete graph and that at least one looped dominating vertex is added. Let A be the vertex set of H 0 and let B be the set of added looped dominating vertices. Let k be the degree of vertices in H 0 ; note k < | A | .
It is easy to check that H has two automorphic similarity classes, namely H 1 = A and H 2 = B . Let x and y be arbitrary vertices in H 1 and H 2 , respectively. We have
$$\deg(x) = k + |B| < |A| + |B| = \deg(y) \qquad \text{and} \qquad m_{1,2} = |B| = m_{2,2}.$$
We apply Corollary 4.3 to conclude that H is Hoffman-London. Furthermore, because any vertex in H 2 is adjacent to itself and to all vertices in H 1 , Corollary 4.3 also shows that H is strongly Hoffman-London.
We note that Theorem 1.5 alone could not possibly imply the full strength of Theorem 4.1, which we demonstrate in Example 4.4.
Example 4.4. Consider the Folkman graph [16], a 4-regular graph on 20 vertices constructed from a K 5 by first subdividing each edge and then cloning each of the vertices of the original K 5 ; see Figure 4 where the vertices of the original K 5 are white and those from subdivided edges are black.
The Folkman graph is bipartite and has two automorphic similarity classes, specifically the two bipartition classes. By Theorem 4.1, adding a looped dominating vertex to the Folkman graph produces an H that is Hoffman-London. This H has three automorphic similarity classes: the two bipartition classes of the Folkman graph, each of size ten, which we call A and B , and the looped dominating vertex, which we call C . Vertices in A (respectively, B ) have no neighbors in A (respectively, B ), four neighbors in B (respectively, A ), and one neighbor in C . The vertex in C has ten neighbors in A , ten in B and one in C . So if we set A = H 1 , B = H 2 (or A = H 2 , B = H 1 )
Figure 4: The Folkman graph.
<details>
<summary>Image 4 Details</summary>

### Visual Description
The figure shows the Folkman graph: a symmetric drawing of a 4-regular bipartite graph on 20 vertices. The ten white vertices are those arising from the original K 5 and its clones, the ten black vertices are the subdivision vertices, and every edge joins a white vertex to a black vertex.
</details>
and C = H 3 , the corresponding automorphic similarity matrix is
$$M = \left[ \begin{array}{ccc} 0 & 4 & 1 \\ 4 & 0 & 1 \\ 10 & 10 & 1 \end{array} \right] ,$$
which fails the increasing columns property of Theorem 1.5. Setting C = H 1 or C = H 2 also produces automorphic similarity matrices that fail the increasing columns property, meaning that Theorem 1.5 cannot be used to conclude that H is Hoffman-London.
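The failure of the increasing columns property for every ordering of the three classes can be confirmed mechanically. The sketch below checks all 3! simultaneous row-and-column orderings of the matrix above, again reading the increasing columns property as requiring the tail sums Σ_{q ≥ j} m_{i,q} to be non-decreasing down the rows (our assumption about the definition).

```python
from itertools import permutations

def has_increasing_columns(M):
    """Tail sums sum(M[i][j:]) non-decreasing in i, for every j (our
    reading of the increasing columns property)."""
    k = len(M)
    return all(sum(M[i][j:]) <= sum(M[i + 1][j:])
               for i in range(k - 1) for j in range(k))

# Automorphic similarity matrix for the Folkman graph plus one looped
# dominating vertex, with classes ordered (A, B, C) as in the text.
M = [[0, 4, 1],
     [4, 0, 1],
     [10, 10, 1]]

def reorder(M, perm):
    """Permute rows and columns of M by the same permutation of classes."""
    return [[M[p][q] for q in perm] for p in perm]

any_ordering_works = any(has_increasing_columns(reorder(M, perm))
                         for perm in permutations(range(3)))
```

No ordering succeeds: the class C row sum (21) forces C last, and then the second-column tail sums 5 and 1 for A and B violate the property either way.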
As an example of the utility of Corollary 4.3 to obtain new families of Hoffman-London graphs, consider the family of constraint graphs H ( a, b, ℓ ), defined for a, b ≥ 1 and ℓ ≥ 0. Start with a clique on b vertices. At each vertex v of the clique, append ℓ copies of a clique on a vertices, with v the only vertex in common to the ℓ appended cliques; see Figure 5. This family generalizes many well-known families of graphs; for example, H (2 , 1 , ℓ ) is the star graph, H (2 , 2 , ℓ ) is the balanced double star (an edge with ℓ edges appended to each endpoint), and H (3 , 1 , ℓ ) is the friendship graph (a collection of ℓ triangles sharing a single common vertex). More generally, H ( a, 1 , ℓ ) is a collection of ℓ cliques on a vertices each, all sharing a single common vertex; following standard notation from topology, we call this a bouquet of ℓ a -cliques .
For some choices of parameters, it is straightforward to verify whether H ( a, b, ℓ ) is Hoffman-London. In the cases ℓ = 0, ℓ ≥ 1 and a = 1, or ℓ = 1 and b = 1, we see H ( a, b, ℓ ) is just a complete graph, which is regular and so Hoffman-London but not strongly Hoffman-London. If ℓ ≥ 2, a = 2, and b = 1 then H ( a, b, ℓ ) is, as observed above, a star, and so is Hoffman-London by Theorem 4.2, although, as we saw in Theorem 2.6, it is not strongly Hoffman-London. For many other choices of parameters, we can use Corollary 4.3.
Corollary 4.5. For all a, b ≥ 2 and ℓ ≥ 1 , the target graph H ( a, b, ℓ ) is strongly Hoffman-London.
Proof. The graph H ( a, b, ℓ ) has two automorphic similarity classes: H 2 consisting of the b vertices of the initial clique and H 1 consisting of all bℓ ( a -1) added vertices. We have m 1 , 2 = 1 ≤ b -1 = m 2 , 2 , and if x is any vertex in H 1 and y any vertex in H 2 we have deg( x ) = a -1 < b -1+ ℓ ( a -1) = deg( y ). We conclude that H is Hoffman-London by Corollary 4.3.
Figure 5: A visualization of the family of graphs H ( a, b, ℓ ).
<details>
<summary>Image 5 Details</summary>

### Visual Description
The figure illustrates the construction of H ( a, b, ℓ ): a large circle labeled K b represents the initial clique, and at each of two representative vertices of K b (drawn as solid dots) ℓ copies of K a are attached, each sharing only that vertex with the rest of the graph; a dotted arc labeled "ℓ times" indicates the repetition.
</details>
Any vertex in H 2 is adjacent to at least one other vertex in H 2 and at least one vertex in H 1 , and so Corollary 4.3 also demonstrates that H is strongly Hoffman-London.
The graphs H ( a, b, ℓ ) that are not covered by Corollary 4.5 or the preceding discussion are those with a ≥ 3, b = 1, and ℓ ≥ 2. In these cases, as mentioned above, H is a bouquet of ℓ a -cliques. Such target graphs have two automorphic similarity classes: one that includes only the central vertex and another that includes all other vertices. For this collection of target graphs, we cannot apply Theorem 1.5: if we choose the central vertex to be H 1 , then the associated automorphic similarity matrix has ℓ ( a -1) as its (1 , 2)-entry and a -2 as its (2 , 2)-entry, and since ℓ ≥ 2 forces ℓ ( a -1) > a -2, it fails the increasing columns property, while if we choose the central vertex to be H 2 then the associated automorphic similarity matrix has 1 as its (1 , 2)-entry and 0 as its (2 , 2)-entry, and so again fails the increasing columns property. It is unknown which graphs of this subfamily, if any, are Hoffman-London.
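As a quick computational aside (not part of the formal argument), one can tabulate both candidate automorphic similarity matrices of a small bouquet and confirm that each ordering fails the increasing columns property. The sketch below is our own illustration; the helper names `bouquet`, `similarity_matrix`, and `increasing_columns` are assumptions for this sketch, and the matrix entry m_{ij} counts the neighbors a vertex of the i-th class has in the j-th class.

```python
def bouquet(a, ell):
    """Adjacency sets of a bouquet of ell copies of K_a sharing central vertex 0."""
    n = 1 + ell * (a - 1)
    adj = {v: set() for v in range(n)}
    for c in range(ell):
        block = [0] + [1 + c * (a - 1) + i for i in range(a - 1)]
        for u in block:
            for v in block:
                if u != v:
                    adj[u].add(v)
    return adj

def similarity_matrix(adj, classes):
    """m[i][j] = number of neighbours a vertex of classes[i] has in classes[j]."""
    return [[len(adj[next(iter(Ci))] & Cj) for Cj in classes] for Ci in classes]

def increasing_columns(M):
    """All terminal (suffix) sums of row i are <= those of row i + 1."""
    k = len(M)
    return all(sum(M[i][j:]) <= sum(M[i + 1][j:])
               for i in range(k - 1) for j in range(k))

adj = bouquet(3, 2)               # bouquet of two triangles, 5 vertices
center, rest = {0}, set(adj) - {0}
# both orderings of the two automorphic similarity classes fail
print(increasing_columns(similarity_matrix(adj, [center, rest])))  # False
print(increasing_columns(similarity_matrix(adj, [rest, center])))  # False
```

For a = 3 and ℓ = 2 the matrix with the center first is [[0, 4], [1, 1]], matching the entries ℓ(a-1) = 4 and a-2 = 1 discussed above.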
## 4.3 Paths and looped paths as target graphs
In this section, we consider paths and fully looped paths as target graphs. We begin by using Theorem 1.5 to recover part of Theorem 4.2, specifically the case when H is a path with an even number of vertices. We also strengthen this portion of Theorem 4.2 by using Corollary 1.6 to show that target graphs H in this case with at least four vertices are strongly Hoffman-London.
Theorem 4.6. The path graph P 2 m is Hoffman-London for all m ≥ 1 and strongly Hoffman-London for all m ≥ 2 .
Proof. Let H = P 2 m for some m ≥ 1, specifically with vertices u 1 , . . . , u 2 m and edges u i u i +1 for 1 ≤ i ≤ 2 m -1. We claim that P 2 m admits exactly two automorphisms: the identity and the
reversal map that sends u i to u 2 m +1 - i for i = 1 , . . . , 2 m . To verify this claim, first note that an automorphism of P 2 m must preserve degrees and thus leaves must be mapped to leaves. If u 1 is mapped to u 1 , then u 2 , the unique neighbor of u 1 , must be mapped to u 2 , u 3 must be mapped to u 3 , and so on, resulting in the identity map. If u 1 is instead mapped to u 2 m , then u 2 must be mapped to u 2 m -1 , u 3 must be mapped to u 2 m -2 , and so on, resulting in the reversal map. It follows that P 2 m has m automorphic similarity classes, each consisting of a pair of vertices { u i , u j } such that i + j = 2 m +1.
Set H i = { u i , u 2 m +1 - i } for i = 1 , . . . , m . The automorphic similarity matrix of H with respect to this ordering of the automorphic similarity classes has the form

$$\begin{bmatrix} 0 & 1 & 0 & 0 & \cdots & 0 & 0 \\ 1 & 0 & 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & 0 & \cdots & 0 & 1 \\ 0 & 0 & 0 & 0 & \cdots & 1 & 1 \\ \end{bmatrix}$$

That is, it has 1s on the first subdiagonal, 1s on the first superdiagonal, a 1 in the ( m,m ) position, and 0s everywhere else. It is straightforward to check that all terminal sums of the i th row are less than or equal to the corresponding terminal sums of the ( i + 1) th row for all i < m , and so this matrix satisfies the increasing columns property, establishing that P 2 m is Hoffman-London by Theorem 1.5.
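The terminal-sum check just described can be automated for small m . The following sketch is our own illustration (the helper names `asm_P2m` and `increasing_columns` are assumptions): it builds the m × m matrix described above and verifies the increasing columns property for a range of m .

```python
def asm_P2m(m):
    """Automorphic similarity matrix of P_{2m} with classes H_i = {u_i, u_{2m+1-i}}:
    1s on the sub- and superdiagonal and a 1 in position (m, m)
    (0-indexed: (m-1, m-1)), 0s elsewhere."""
    M = [[0] * m for _ in range(m)]
    for i in range(m - 1):
        M[i][i + 1] = 1
        M[i + 1][i] = 1
    M[m - 1][m - 1] = 1
    return M

def increasing_columns(M):
    """All terminal (suffix) sums of row i are <= those of row i + 1."""
    k = len(M)
    return all(sum(M[i][j:]) <= sum(M[i + 1][j:])
               for i in range(k - 1) for j in range(k))

print(all(increasing_columns(asm_P2m(m)) for m in range(1, 9)))  # True
```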
We now turn to establishing that P 2 m is strongly Hoffman-London for m ≥ 2. We seek to apply Corollary 1.6. Towards this goal, we will show by induction on t that
$$\hom_{v \to 1}(P_t, P_{2m}) < \hom_{v \to 2}(P_t, P_{2m}) \le \hom_{v \to 3}(P_t, P_{2m}) \le \cdots \le \hom_{v \to m}(P_t, P_{2m}), \quad (12)$$
where v is an endvertex of P t .
When t = 2, inequality (12) holds because hom v → 1 ( P 2 , P 2 m ) = 1 and hom v → i ( P 2 , P 2 m ) = 2 for 2 ≤ i ≤ m . For t > 2, let v ′ be the unique neighbor of v in P t . Using equation (4) from the proof of Lemma 3.3, we have the recurrence relations
$$\begin{aligned} \hom_{v \to 1}(P_t, P_{2m}) &= \hom_{v' \to 2}(P_{t-1}, P_{2m}), \\ \hom_{v \to i}(P_t, P_{2m}) &= \hom_{v' \to i-1}(P_{t-1}, P_{2m}) + \hom_{v' \to i+1}(P_{t-1}, P_{2m}) \quad (2 \le i \le m-1), \\ \hom_{v \to m}(P_t, P_{2m}) &= \hom_{v' \to m-1}(P_{t-1}, P_{2m}) + \hom_{v' \to m}(P_{t-1}, P_{2m}). \end{aligned}$$
By induction, we have
$$\hom_{v' \to 1}(P_{t-1}, P_{2m}) < \hom_{v' \to 2}(P_{t-1}, P_{2m}) \le \cdots \le \hom_{v' \to m}(P_{t-1}, P_{2m}). \quad (13)$$
When m = 2, we use the fact that hom v ′ → 1 ( P t -1 , P 4 ) > 0 to say
$$\hom_{v \to 1}(P_t, P_4) = \hom_{v' \to 2}(P_{t-1}, P_4) < \hom_{v' \to 1}(P_{t-1}, P_4) + \hom_{v' \to 2}(P_{t-1}, P_4) = \hom_{v \to 2}(P_t, P_4),$$
which establishes inequality (12).
For m ≥ 3, we first establish hom v → 1 ( P t , P 2 m ) < hom v → 2 ( P t , P 2 m ). By inequality (13), we see that hom v ′ → 2 ( P t -1 , P 2 m ) ≤ hom v ′ → 3 ( P t -1 , P 2 m ). Then, again using hom v ′ → 1 ( P t -1 , P 2 m ) > 0, we have

$$\hom_{v \to 1}(P_t, P_{2m}) = \hom_{v' \to 2}(P_{t-1}, P_{2m}) \le \hom_{v' \to 3}(P_{t-1}, P_{2m}) < \hom_{v' \to 1}(P_{t-1}, P_{2m}) + \hom_{v' \to 3}(P_{t-1}, P_{2m}) = \hom_{v \to 2}(P_t, P_{2m}).$$
Next, we show hom v → i ( P t , P 2 m ) ≤ hom v → i +1 ( P t , P 2 m ) for 2 ≤ i ≤ m -2 by using inequality (13) to say
$$\hom_{v \to i}(P_t, P_{2m}) = \hom_{v' \to i-1}(P_{t-1}, P_{2m}) + \hom_{v' \to i+1}(P_{t-1}, P_{2m}) \le \hom_{v' \to i}(P_{t-1}, P_{2m}) + \hom_{v' \to i+2}(P_{t-1}, P_{2m}) = \hom_{v \to i+1}(P_t, P_{2m}).$$
Finally, we demonstrate hom v → m -1 ( P t , P 2 m ) ≤ hom v → m ( P t , P 2 m ) by once again using inequality (13):

$$\hom_{v \to m-1}(P_t, P_{2m}) = \hom_{v' \to m-2}(P_{t-1}, P_{2m}) + \hom_{v' \to m}(P_{t-1}, P_{2m}) \le \hom_{v' \to m-1}(P_{t-1}, P_{2m}) + \hom_{v' \to m}(P_{t-1}, P_{2m}) = \hom_{v \to m}(P_t, P_{2m}),$$
which establishes inequality (12).
Finally, we use inequality (12) to show that P 2 m is strongly Hoffman-London. We consider the cases m = 2 and m > 2 separately. For m = 2 and t ≥ 2, set a ( t ) = 1 and b ( t ) = 2. From inequality (12), we have hom v → a ( t ) ( P s , P 4 ) < hom v → b ( t ) ( P s , P 4 ) for all s ≥ 2. To apply Corollary 1.6 to show that P 4 is strongly Hoffman-London, it remains to show that for all t ≥ 2 there is a P 4 -coloring of P t that sends one endvertex v of P t to some vertex in H 1 and the other endvertex w to some vertex in H 2 . For t even, such a coloring is given by sending v to u 1 ∈ H 1 , then sending the vertices of P t alternately to u 2 and u 1 , ending by sending w to u 2 ∈ H 2 . For t odd, such a coloring is given by sending v to u 1 ∈ H 1 , then sending the vertices of P t alternately to u 2 and u 1 except for w , which is sent to u 3 ∈ H 2 instead of u 1 .
For m > 2, set a ( t ) = 1 for t ≥ 2, and set b ( t ) = 2 for even t ≥ 2 and set b ( t ) = 3 for odd t ≥ 2. From inequality (12), we again have hom v → a ( t ) ( P s , P 2 m ) < hom v → b ( t ) ( P s , P 2 m ) for all s ≥ 2. To exhibit a coloring of P t that sends v to a vertex in H a ( t ) and w to a vertex in H b ( t ) , we proceed exactly as in the m = 2 case: send v to u 1 ∈ H 1 and proceed to alternate sending the vertices of P t to u 2 and u 1 , ending by sending w to u 2 ∈ H 2 when t is even and deviating by sending w to u 3 ∈ H 3 instead of u 1 when t is odd.
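Although the proof above is self-contained, the conclusion of Theorem 4.6 is easy to confirm by brute force on small cases. The sketch below is our own illustration (the naive enumerator `hom` is an assumed helper, not the paper's method): it counts P 4 -colorings of the three trees on five vertices and checks that the path attains the strict minimum.

```python
from itertools import product

def hom(G_edges, n, H_adj):
    """Count homomorphisms from a graph on vertices 0..n-1 into H by brute force."""
    return sum(all(f[u] in H_adj[f[v]] for u, v in G_edges)
               for f in product(range(len(H_adj)), repeat=n))

# target: the path P_4 on colours 0, 1, 2, 3
P4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}

# the three trees on 5 vertices: path, star, and broom
path5 = [(0, 1), (1, 2), (2, 3), (3, 4)]
star5 = [(0, 1), (0, 2), (0, 3), (0, 4)]
broom5 = [(0, 1), (1, 2), (2, 3), (2, 4)]

counts = {name: hom(e, 5, P4) for name, e in
          [("path", path5), ("star", star5), ("broom", broom5)]}
print(counts["path"] < counts["broom"] and counts["path"] < counts["star"])  # True
```

Here the path admits 26 P 4 -colorings, strictly fewer than the broom (28) and the star (34).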
Theorem 4.6 shows that P ℓ is strongly Hoffman-London for all even ℓ ≥ 4, and Observation 2.3 establishes that P 2 is not strongly Hoffman-London, because all trees admit the same number of P 2 -colorings. What can be said of P ℓ when ℓ is odd? As demonstrated in Theorem 2.6, P 3 is not strongly Hoffman-London but in a different manner than P 2 : trees with a balanced bipartition, including paths but not all trees, minimize hom( T n , P 3 ). We know that paths are among the
minimizers of hom( T n , P ℓ ) for odd ℓ due to Csikvári and Lin [7], but the precise set of trees that minimize the number of P ℓ -colorings for odd ℓ ≥ 5 remains open.
The fully looped path P ◦ n is the graph on vertex set u 1 , . . . , u n with edges u i u i +1 for 1 ≤ i ≤ n -1 and u i u i for each 1 ≤ i ≤ n . Unlike for unlooped paths, for which we were only able to apply our techniques when the number of vertices is even, we can completely classify all fully looped paths.
Theorem 4.7. The fully looped path P ◦ n is Hoffman-London for all n ≥ 1 and strongly Hoffman-London for all n ≥ 3 .
Proof. For n = 1 or 2, P ◦ n is a fully looped complete graph and so Observation 2.3 tells us that P ◦ n is Hoffman-London but not strongly Hoffman-London. When n = 3, P ◦ n is an instance of a regular graph with a looped dominating vertex added, and so applying Theorem 4.1 establishes that P ◦ n is strongly Hoffman-London.
For n ≥ 4, we start by considering n even, say n = 2 m . Exactly as for P 2 m , we find that the automorphic similarity classes of P ◦ 2 m are H i = { v i , v 2 m +1 - i } , i = 1 , . . . , m , each of size 2, and the associated automorphic similarity matrix has the form
$$\begin{bmatrix} 1 & 1 & 0 & 0 & \cdots & 0 & 0 \\ 1 & 1 & 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & 1 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & 0 & 0 & \cdots & 1 & 1 \\ 0 & 0 & 0 & 0 & \cdots & 1 & 2 \\ \end{bmatrix}$$
(i.e., it is the automorphic similarity matrix of P 2 m modified by the addition of the identity matrix I m ). It is straightforward to check that all terminal sums of the i th row are less than or equal to the corresponding terminal sums of the ( i +1) th row for all i < m , and so this matrix satisfies the increasing columns property and P ◦ 2 m is Hoffman-London.
For n ≥ 5 and odd, say n = 2 m -1, the automorphic similarity classes of P ◦ 2 m -1 are H i = { v i , v 2 m -i } , i = 1 , . . . , m -1, each of size 2, and H m = { v m } (of size 1). The structure of the corresponding automorphic similarity matrix is the same as in the even case, except that in the final row the 1 and 2 are flipped. That M satisfies the increasing columns condition follows almost exactly as it did in the even case.
We have established that P ◦ n is Hoffman-London for all n and strongly Hoffman-London for n = 3. To establish that P ◦ n is strongly Hoffman-London for n ≥ 4, we will show that Corollary 1.6 can be applied. We first observe that for all s ≥ 2 we have hom v → 1 ( P s , P ◦ n ) < hom v → 2 ( P s , P ◦ n ), where v is one of the endvertices of P s . We can see this via a non-surjective injection from Hom v → 1 ( P s , P ◦ n ) into Hom v → 2 ( P s , P ◦ n ). For concreteness, we take a fixed x ∈ H 1 (necessarily one of the endvertices of P ◦ n ) to be the representative of H 1 that v is mapped to, and take its unique neighbor x ′ ∈ H 2 to be the representative of H 2 that v is mapped to. The injection is simple: given f ∈ Hom v → 1 ( P s , P ◦ n ) with f ( v ) = x , map f to f ′ by changing the value at v to x ′ . That f ′ ∈ Hom v → 2 ( P s , P ◦ n ) (i.e., that f ′ is actually a homomorphism) follows from the fact that f ′ maps v ′ (the unique neighbor of v in P s ) to a neighbor of x , so either x or x ′ , and both of these are neighbors of x ′ . To see that the injection is not a surjection, consider f ′ ∈ Hom v → 2 ( P s , P ◦ n ) that sends v to x ′ and all other vertices of P s to x ′′ , the neighbor of x ′ in H 2 . There is no f ∈ Hom v → 1 ( P s , P ◦ n ) that maps onto f ′ via our injection, since x is not adjacent to x ′′ in P ◦ n .
We next observe that for all t ≥ 2 there is a P ◦ n -coloring of P t that sends one endvertex to a vertex in H 1 and the other to a vertex in H 2 -simply send one endvertex to x and all the others to x ′ . That P ◦ n ( n ≥ 4) is strongly Hoffman-London now follows from Corollary 1.6.
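The strict inequality hom v → 1 ( P s , P ◦ n ) < hom v → 2 ( P s , P ◦ n ) exhibited by the injection above can also be observed numerically. A minimal sketch for n = 4, our own check (the helper name `hom_end_to` is an assumption):

```python
from itertools import product

# fully looped path P°_4 on colours 0..3 (a loop at every vertex)
H = {0: {0, 1}, 1: {0, 1, 2}, 2: {1, 2, 3}, 3: {2, 3}}

def hom_end_to(s, colour):
    """Colourings of the path P_s (vertices 0..s-1) whose endvertex 0 gets `colour`."""
    edges = [(i, i + 1) for i in range(s - 1)]
    return sum(all(f[u] in H[f[v]] for u, v in edges)
               for f in product(range(4), repeat=s) if f[0] == colour)

# H_1 = {0, 3}, H_2 = {1, 2}; the injection argument predicts a strict gap
print(all(hom_end_to(s, 0) < hom_end_to(s, 1) for s in range(2, 7)))  # True
```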
We end our discussion of paths with a quick corollary of Theorem 4.7 to the family of ladder graphs . The ladder graph L n has vertices u 1 , . . . , u n , v 1 , . . . , v n and edges u i u i +1 and v i v i +1 , for 1 ≤ i ≤ n -1, and u i v i for 1 ≤ i ≤ n .
Corollary 4.8. For n ≥ 1 , the ladder graph L n is Hoffman-London. For n ≥ 3 it is strongly Hoffman-London.
Proof. We have L n = P ◦ n × K 2 , so the result follows from Observation 2.5 and Theorem 4.7.
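The identification L n = P ◦ n × K 2 uses the tensor (categorical) product: ( i, x ) ∼ ( j, y ) exactly when i ∼ j in P ◦ n and x ∼ y in K 2 . As an illustrative check (our own sketch; the explicit relabelling u i = ( i, i mod 2), v i = ( i, ( i +1) mod 2) is an assumption introduced here), the following compares edge sets for n = 4:

```python
from itertools import product

n = 4
# tensor product of P°_n and K_2:
#   i ~ j in P°_n  iff |i - j| <= 1 (loops included); 0 ~ 1 only in K_2
prod_edges = {(u, v) for u in product(range(n), range(2))
                     for v in product(range(n), range(2))
              if abs(u[0] - v[0]) <= 1 and u[1] != v[1]}

# ladder L_n under the (assumed) relabelling u_i = (i, i % 2), v_i = (i, (i+1) % 2)
ladder = set()
for i in range(n):
    ladder.add(((i, i % 2), (i, (i + 1) % 2)))            # rung u_i v_i
    if i + 1 < n:
        ladder.add(((i, i % 2), (i + 1, (i + 1) % 2)))    # rail u_i u_{i+1}
        ladder.add(((i, (i + 1) % 2), (i + 1, i % 2)))    # rail v_i v_{i+1}
ladder |= {(v, u) for u, v in ladder}                     # record both directions

print(prod_edges == ladder)  # True
```

The loops of P ◦ n become the rungs of the ladder, and each path edge contributes the two rail edges.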
## 4.4 Loop threshold graphs
Here we define the family of loop threshold graphs and prove that they are Hoffman-London. Denote by K 1 the graph on one vertex with no edges, and by K ◦ 1 the graph on one vertex with a loop. The family of loop threshold graphs is the minimal (by inclusion) family of graphs that includes K 1 and K ◦ 1 , is closed under adding isolated vertices, and is closed under adding looped dominating vertices. Loop threshold graphs were introduced by Cutler and Radcliffe [13] where they were shown to be a natural family to consider in the context of H -coloring; one of the results of [13] is that if H is loop threshold, then among n -vertex m -edge graphs, there is a threshold graph that maximizes the number of H -colorings. For further work on loop threshold graphs, see also [11] (regarding H -coloring) and [21] (regarding enumeration).
The following characterization of loop threshold graphs is stated in [13], where the authors suggest adapting a proof from [8]. For completeness, we present the details here.
Lemma 4.9. For a graph H on n vertices, possibly with loops, the following are equivalent.
1. H is loop threshold.
2. There is an ordering of the vertices of H , say v 1 , v 2 , . . . , v n , with the property that
$$N(v_1) \subseteq N(v_2) \subseteq \cdots \subseteq N(v_n).$$
We refer to an ordering of vertices satisfying the second condition in Lemma 4.9 as a nested neighborhood ordering .
Proof. We begin by showing that if H is a loop threshold graph, then it admits a nested neighborhood ordering. Let O be the family of graphs for which a nested neighborhood ordering exists. Trivially, K 1 , K ◦ 1 ∈ O . Suppose now that H ∈ O . Let v 1 , . . . , v n be a nested neighborhood ordering for H . If H ′ is obtained from H by adding an isolated vertex, u , then since N H ′ ( u ) = ∅ we have that u, v 1 , . . . , v n is a nested neighborhood ordering for H ′ , so H ′ ∈ O . If instead H ′ is obtained from H by adding a looped dominating vertex, u , then since N H ′ ( u ) = { v 1 , . . . , v n , u } , we have that v 1 , . . . , v n , u is a nested neighborhood ordering for H ′ , so H ′ ∈ O . It follows from the definition of the family of loop threshold graphs that all loop threshold graphs are in O .
For the reverse implication, let H be a graph on n vertices that admits a nested neighborhood ordering v 1 , . . . , v n . We show by induction on n that H is loop threshold. When n = 1 this is clear, so we assume n > 1.
If N ( v n ) = { v 1 , . . . , v n } , then v n is a looped dominating vertex in H . Let H ′ be obtained from H by deleting v n . We have that
$$N_{H'}(v_i) = N_H(v_i) \setminus \{v_n\} \quad \text{for each } 1 \le i \le n-1,$$
and so v 1 , . . . , v n -1 is a nested neighborhood ordering for H ′ . By induction, H ′ is a loop threshold graph, so H , being obtained from H ′ by adding a looped dominating vertex, is also loop threshold.
Otherwise there is v i in H that is not in N ( v n ). Then v i must be isolated: if there were v j such that v i ∼ v j , then v i ∈ N ( v j ) ⊆ N ( v n ). So we observe that for each j , N ( v j ) \ { v i } = N ( v j ), meaning if H ′ is obtained from H by deleting v i , then H ′ admits the nested neighborhood ordering v 1 , . . . , v i -1 , v i +1 , . . . , v n . By induction, H ′ is a loop threshold graph, and as we obtain H by adding v i , an isolated vertex, H is also loop threshold.
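The peeling argument in this proof suggests a simple recognition procedure for loop threshold graphs: repeatedly delete an isolated vertex or a looped dominating vertex and check whether the graph empties. The sketch below is our own illustration (the helper name `is_loop_threshold` is an assumption; adjacency is given as vertex → neighbor set, with loops encoded by v ∈ N ( v )):

```python
def is_loop_threshold(adj):
    """Greedy recognition following the peeling in the proof of Lemma 4.9:
    repeatedly delete an isolated vertex (N(v) ⊆ {v}) or a looped
    dominating vertex (N(v) equals the whole remaining vertex set)."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # defensive copy
    while adj:
        vs = set(adj)
        pick = next((v for v in vs if adj[v] <= {v} or adj[v] == vs), None)
        if pick is None:
            return False
        del adj[pick]
        for nbrs in adj.values():
            nbrs.discard(pick)
    return True

# H_ind (an unlooped vertex joined to a looped vertex) is loop threshold
H_ind = {0: {1}, 1: {0, 1}}
# the unlooped triangle K_3 is not
K3 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(is_loop_threshold(H_ind), is_loop_threshold(K3))  # True False
```

Emptying the graph by such deletions is equivalent to building it by adding isolated and looped dominating vertices, which is exactly the definition above.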
Using Lemma 4.9, we now prove that loop threshold graphs are Hoffman-London, and we characterize the strongly Hoffman-London loop threshold graphs.
Theorem 4.10. Each loop threshold graph is Hoffman-London. Furthermore, if H is a loop threshold graph, then H is strongly Hoffman-London if and only if, after removing all isolated vertices from H , the resulting graph is neither empty nor a fully looped complete graph.
Note that we consider a vertex v isolated if N ( v ) ⊆ { v } . In other words, a single vertex with a loop is still an isolated vertex.
Proof. Let H be a loop threshold graph. To establish the Hoffman-London property, we seek to apply Theorem 1.5. By Lemma 4.9, there is an ordering of the vertices of H , say v 1 , v 2 , . . . , v n , with the property that
$$N(v_1) \subseteq N(v_2) \subseteq \cdots \subseteq N(v_n).$$
Suppose v i ∈ H a and v i +1 ∈ H b for some a ≠ b . We claim v j / ∈ H a for all j > i . Suppose otherwise: assume v i ∈ H a , v i +1 ∈ H b , and v j ∈ H a for some j > i . As v i and v j both belong to H a , | N ( v i ) | = | N ( v j ) | , and as i < j , N ( v i ) ⊆ N ( v j ), so N ( v i ) = N ( v j ). Then
$$N(v_i) \subseteq N(v_{i+1}) \subseteq N(v_j) = N(v_i),$$
so N ( v i +1 ) = N ( v i ) as well. Thus the function on V ( H ) that swaps v i and v i +1 and fixes all other vertices is an automorphism of H , contradicting that v i +1 / ∈ H a . We conclude that this ordering of the vertices induces an ordering of the automorphic similarity equivalence classes of H , say H 1 , . . . , H k , such that if u ∈ H i and v ∈ H j for i < j then N ( u ) ⊆ N ( v ). If we use this ordering of the automorphic similarity classes to define the automorphic similarity matrix M , we see that each column of M is non-decreasing: for any 1 ≤ j ≤ k and 1 ≤ i ≤ k -1, m i,j ≤ m i +1 ,j because for any u ∈ H i and v ∈ H i +1 , N ( u ) ⊆ N ( v ), so each w ∈ H j is adjacent to u only if it is adjacent to v . Sums of non-decreasing vectors are also non-decreasing, so this ordering of the automorphic similarity classes of H satisfies the hypothesis of Theorem 1.5. We conclude that H is Hoffman-London, as claimed.
We now turn to the characterization of loop threshold graphs that are strongly Hoffman-London. For n ≥ 2, each vertex of any tree on n vertices is incident to an edge and cannot be mapped to an isolated vertex of H . (This is essentially the content of Observation 2.2). Thus if H is loop threshold and H ′ is the graph resulting from removing the isolated vertices from H , we have
hom( T n , H ) = hom( T n , H ′ ) for all n ≥ 2. If H ′ is empty, then hom( T n , H ′ ) = 0 for all T n ∈ T n . If H ′ is a fully looped complete graph, it is regular, so each T n ∈ T n admits the same number of H -colorings by Observation 2.3.
Now assume that upon deletion of all isolated vertices, H is non-empty and not a fully looped complete graph. Note that the resulting graph is also loop threshold: because only isolated vertices were removed, the neighborhoods of the remaining vertices are unchanged, so the resulting graph still admits a nested neighborhood ordering. Thus it suffices to assume H simply has no isolated vertices and is not a fully looped complete graph. To show that H is strongly Hoffman-London, we use Corollary 1.6.
We begin by showing that H has at least two automorphic similarity classes. If every vertex of H is looped, then H must have been constructed by adding looped dominating vertices at every step, contradicting that H is not a fully looped complete graph. Let u be a vertex of H without a loop. Because u was not removed, it is not isolated, so there is v ∼ u . We claim that u comes before v in the nested neighborhood ordering of H ; if not, then u ∈ N ( v ) ⊆ N ( u ), which contradicts our claim that u does not have a loop. By similar reasoning, v must have a loop: v ∈ N ( u ) ⊆ N ( v ). Because u does not have a loop but v does, no automorphism of H can send u to v . We conclude H has at least two distinct automorphic similarity classes.
Now for each t ≥ 2, let a ( t ) = i , where H i is the class containing u , and let b ( t ) = j , where H j is the class containing v . We first observe that there is an H -coloring of P t that sends the first vertex of the path to some vertex in H a ( t ) and the last vertex of the path to some vertex in H b ( t ) . Indeed, mapping the first vertex of P t to u and all subsequent vertices to v yields such a coloring.
To apply Corollary 1.6 to conclude that H is strongly Hoffman-London, it remains to show that for each t ≥ 2 and s ≥ 2, hom w → b ( t ) ( P s , H ) > hom w → a ( t ) ( P s , H ) where w is one of the endvertices of P s . In what follows, when enumerating hom w → a ( t ) ( P s , H ) (respectively, hom w → b ( t ) ( P s , H )) we will take u (respectively, v ) to be the specific vertex of H a ( t ) (respectively, H b ( t ) ) to which w is mapped in an H -coloring. There is a simple injection from Hom w → a ( t ) ( P s , H ) into Hom w → b ( t ) ( P s , H ): change the color at w from u to v . That this is a valid map comes from the fact that N ( u ) ⊆ N ( v ). To see that this injection is not a bijection, and so hom w → b ( t ) ( P s , H ) > hom w → a ( t ) ( P s , H ), consider the H -coloring of P s that sends w to v , that sends the unique neighbor of w in P s to u , and that sends all other vertices of P s (if there are any) to v . This coloring is in Hom w → b ( t ) ( P s , H ), but, since there is not a loop at u in H , it is not the image of any coloring in Hom w → a ( t ) ( P s , H ).
We conclude our discussion of loop threshold graphs with an application of Theorem 4.10 to a family of target graphs that have arisen both in statistical physics and communications networks. For integer C ≥ 1, define a capacity C independent set in a graph G to be a function f : V ( G ) → { 0 , 1 , 2 , . . . , C } satisfying f ( x )+ f ( y ) ≤ C for all xy ∈ E ( G ). Standard independent sets are capacity 1 independent sets. This notion was introduced by Mazel and Suhov [33] as a statistical physics model of a random surface and was subsequently studied in [18, 35] in the context of multicast communications networks. Capacity C independent sets can be encoded as H -colorings by taking H = H C to be the graph on vertex set { 0 , 1 , . . . , C } with edge set {{ a, b } : a + b ≤ C } . The graph H C admits a nested neighborhood ordering, namely C, C -1 , . . . , 1 , 0, since N H C ( i ) = { 0 , 1 , . . . , C -i } , so by Lemma 4.9 H C is loop threshold, and the following result is a corollary of Theorem 4.10.
Corollary 4.11. For C ≥ 1 and n ≥ 1 , if T n is a tree on n vertices that is different from P n , then P n admits strictly fewer capacity C independent sets than T n .
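Corollary 4.11 can be checked directly on small examples. The sketch below (our own illustration; `capacity_sets` is an assumed helper name) enumerates capacity- C independent sets straight from the definition, f ( x ) + f ( y ) ≤ C on every edge, and compares the path and the star on five vertices for C = 2:

```python
from itertools import product

def capacity_sets(edges, n, C):
    """Capacity-C independent sets of a graph on vertices 0..n-1:
    maps f : V -> {0, ..., C} with f(x) + f(y) <= C on every edge."""
    return sum(all(f[u] + f[v] <= C for u, v in edges)
               for f in product(range(C + 1), repeat=n))

path5 = [(i, i + 1) for i in range(4)]
star5 = [(0, i) for i in range(1, 5)]
print(capacity_sets(path5, 5, 2), capacity_sets(star5, 5, 2))  # 70 98
```

As the corollary predicts, the path has strictly fewer (70 versus 98 for the star).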
## 4.5 Adding looped dominating vertices to a target graph
In this section we establish that the set of graphs H for which Theorem 1.5 is applicable is closed under adding looped dominating vertices and present an application that will be useful later.
Proposition 4.12. Suppose that H is a target graph that has an automorphic similarity matrix satisfying the increasing columns property. Let ˜ H be obtained from H by adding some number of looped dominating vertices. Then ˜ H is Hoffman-London.
Proof. We begin by considering the case where H does not have any looped dominating vertices. Let the automorphic similarity classes of H be H 1 , . . . , H k , presented in such an order that the associated automorphic similarity matrix M satisfies the increasing columns property. Then let ˜ H be obtained from H by adding b looped dominating vertices.
Any automorphism of ˜ H must be an extension of an automorphism of H obtained by permuting the looped dominating vertices of ˜ H , and so ˜ H has automorphic similarity classes ˜ H 1 , . . . , ˜ H k , ˜ H k +1 where ˜ H i = H i for i ≤ k and ˜ H k +1 is the set of b added looped dominating vertices. The associated automorphic similarity matrix ˜ M is obtained from M by adding a new row at the bottom whose j th entry ( j = 1 , . . . , k ) is | H j | , and then adding a new column at the far right, all of whose entries are b . The last column of this matrix is constant and so non-decreasing.
For i and j with 1 ≤ i ≤ k -1 and 1 ≤ j ≤ k , the sum ˜ S i,j of the entries of row i of ˜ M starting from the j th entry is b greater than S i,j , the sum of the entries of row i of M starting from the j th entry, and similarly ˜ S i +1 ,j = S i +1 ,j + b , so ˜ S i,j ≤ ˜ S i +1 ,j follows from S i,j ≤ S i +1 ,j .
Finally, the ( k, j )-entry (1 ≤ j ≤ k ) of ˜ M , which counts the number of neighbors an element of H k has in H j , is trivially bounded above by | H j | , which is the ( k +1 , j )-entry of ˜ M ; this, together with the fact that the ( k, k +1) th and ( k +1 , k +1) th entries of ˜ M agree, shows that ˜ S k,j ≤ ˜ S k +1 ,j . We conclude that ˜ M has the increasing columns property and that ˜ H is Hoffman-London.
We now consider the case where H has some looped dominating vertices. The collection of looped dominating vertices in H forms an automorphic similarity class. Moreover if the automorphic similarity classes of H are H 1 , . . . , H k , presented in such an order that the associated automorphic similarity matrix M satisfies the increasing columns property, then the class H k must be the class consisting of all the looped dominating vertices. To see this, note that the increasing columns property applied to the sum of all k columns of M simply says d 1 ≤ d 2 ≤ · · · ≤ d k , where d i is the common degree of the vertices in H i . Looped dominating vertices have the maximum possible degree, namely | H | , and any vertex with degree | H | is a looped dominating vertex. It follows that d k = | H | and that H k consists of looped dominating vertices.
Now once again let ˜ H be obtained from H by adding b looped dominating vertices. The automorphic similarity classes of ˜ H are ˜ H 1 , ˜ H 2 , . . . , ˜ H k , where ˜ H i = H i for i < k and ˜ H k is the union of H k and the b added looped dominating vertices. The associated automorphic similarity matrix ˜ M is obtained from M by adding b to every entry in the final column of M . In a similar manner to the previous case, it follows that ˜ M has the increasing columns property and that ˜ H is Hoffman-London.
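The matrix manipulation in the proof of Proposition 4.12 is easy to carry out mechanically. The sketch below is our own illustration (the helper name `augment` and the example matrix are assumptions): it appends the new column of b 's and the new bottom row (| H 1 | , . . . , | H k | , b ) and rechecks the increasing columns property.

```python
def augment(M, class_sizes, b):
    """Matrix for H̃ = H plus b looped dominating vertices, as in the proof of
    Proposition 4.12: append a rightmost column of b's and a bottom row
    (|H_1|, ..., |H_k|, b)."""
    Mt = [row + [b] for row in M]
    Mt.append(list(class_sizes) + [b])
    return Mt

def increasing_columns(M):
    """All terminal (suffix) sums of row i are <= those of row i + 1."""
    k = len(M)
    return all(sum(M[i][j:]) <= sum(M[i + 1][j:])
               for i in range(k - 1) for j in range(k))

# example (ours): H = disjoint union of a looped K_1 and a looped K_2,
# classes ordered by clique size; matrix diag(1, 2), class sizes (1, 2)
M = [[1, 0], [0, 2]]
print(increasing_columns(M), increasing_columns(augment(M, [1, 2], 3)))  # True True
```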
Before presenting an application of Proposition 4.12, we discuss blow-ups of graphs.
Definition 4.13. Let H be a target graph on vertex set { v 0 , v 1 , . . . , v k } without multiple edges but perhaps with loops. Let n = ( n 0 , n 1 , . . . , n k ) be a vector of positive integers. The blow-up of H by n , denoted H n , has vertex set ⋃ k i =0 V i , where V i is a set of size n i and the V i s are pairwise
disjoint. If v i ∼ H v j (with either i = j or i ≠ j ), then there is an edge in H n between every x ∈ V i and y ∈ V j , and there are no other edges.
In other words, H n is obtained from H by 'blowing up' each vertex of H to a cluster of vertices, replacing unlooped vertices with empty graphs, looped vertices with fully looped complete graphs, and edges with complete bipartite graphs. As an example, the join of an empty graph and a fully looped complete graph is a blow-up of H ind , the target graph that encodes independent sets; see Figure 6.
Figure 6: An example of a blow-up with n = (3 , 2) where the unlooped vertex of H = H ind is labeled v 0 and the looped vertex is labeled v 1 .
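Definition 4.13 translates directly into code. The following sketch (our own illustration; `blow_up` is an assumed helper name) constructs the blow-up of H ind by n = (3 , 2) from Figure 6 and inspects the resulting neighborhood sizes:

```python
def blow_up(H_adj, sizes):
    """Blow-up H^n of Definition 4.13: vertex v_i becomes a cluster V_i of
    sizes[i] vertices, and (i, x) ~ (j, y) iff v_i ~ v_j in H (so loops in H
    become fully looped complete graphs, edges become complete bipartite)."""
    verts = [(i, x) for i, n_i in enumerate(sizes) for x in range(n_i)]
    return {u: {v for v in verts if v[0] in H_adj[u[0]]} for u in verts}

# H_ind: unlooped vertex 0 joined to looped vertex 1; blow up by n = (3, 2)
H_ind = {0: {1}, 1: {0, 1}}
G = blow_up(H_ind, (3, 2))
# V_0 becomes an empty graph on 3 vertices, V_1 a fully looped K_2, joined
# completely; a looped vertex belongs to its own neighborhood set here
print(sorted(len(G[v]) for v in G))  # [2, 2, 2, 5, 5]
```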
We now show, as a corollary of Proposition 4.12, that blow-ups of fully looped stars (stars augmented by putting a loop at every vertex) are Hoffman-London. We will present an application of this corollary, to the Widom-Rowlinson model from statistical physics, in Section 4.6.
Corollary 4.14. Let S ◦ t +1 be the fully looped star on t +1 vertices v 0 , v 1 , . . . , v t , with v 0 the center of the star. Let n = ( n 0 , n 1 , . . . , n t ) be any vector of positive integers. The blow-up of S ◦ t +1 by n is Hoffman-London.
Proof. We start by considering the graph H ( n 1 , . . . , n t ), which is the disjoint union of t fully looped complete graphs K ◦ 1 , . . . , K ◦ t on n 1 , . . . , n t vertices, respectively. Let S 1 , . . . , S k be the partition of { 1 , . . . , t } induced by the equivalence relation i ≡ j if and only if n i = n j , and let c i denote the common value of n j for j ∈ S i . Without loss of generality, we may assume that c 1 ≤ c 2 ≤ · · · ≤ c k . For example, if ( n 1 , n 2 , n 3 , n 4 , n 5 , n 6 ) = (3 , 10 , 4 , 3 , 10 , 10), then we would take S 1 = { 1 , 4 } with c 1 = 3, S 2 = { 3 } with c 2 = 4 and S 3 = { 2 , 5 , 6 } with c 3 = 10. The automorphic similarity classes of H are H i = ⋃ j ∈ S i V ( K ◦ j ): two vertices are automorphically similar exactly when they lie in cliques of the same size. Since every neighbor of a vertex lies in its own clique, the corresponding automorphic similarity matrix is diag { c 1 , . . . , c k } . By our choice of ordering on the c i , this satisfies the increasing columns property.
It follows from Theorem 1.5 that H ( n 1 , . . . , n t ) is Hoffman-London, and therefore it follows from Proposition 4.12 that the graph H ( n 0 , n 1 , . . . , n t ) obtained from H ( n 1 , . . . , n t ) by adding n 0 looped dominating vertices is also Hoffman-London. But H ( n 0 , n 1 , . . . , n t ) is exactly the blow-up of S ◦ t +1 by n , and this observation completes the proof.
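The grouping step in the proof can be sketched in a few lines of code (the function name is ours), reproducing the worked example from the proof.

```python
def similarity_class_sizes(ns):
    """Group the block sizes n_1, ..., n_t by value and return the class
    sizes s_1 <= ... <= s_k, where s_i sums the n_j in the i-th class."""
    classes = {}
    for m in ns:
        classes[m] = classes.get(m, 0) + m
    return sorted(classes.values())

# The worked example from the proof: (3, 10, 4, 3, 10, 10) -> [4, 6, 30]
print(similarity_class_sizes((3, 10, 4, 3, 10, 10)))
```

Sorting the class sizes is exactly the reordering that gives the increasing columns property of the automorphic similarity matrix diag{ s 1 , . . . , s k }.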
## 4.6 Connection to statistical physics
In this section we briefly discuss a connection between graph homomorphisms and statistical physics models, and we use some of our results to derive consequences for the partition functions of some of these models. For more thorough (though still accessible) treatments of the connection, see e.g. [1, 2, 24].
One goal of statistical physics is to understand the global behavior of models that are defined by local rules. In a hard constraint spin model , space is modeled by a graph G . The vertices of G represent sites, each of which is occupied by at most one particle. Each particle has a certain spin, from among some specified collection of possible spins. The edges of G encode pairs of sites that are close enough that the particles at those sites interact. The interaction rule in a hard constraint model is that only certain pairs of spins are allowed to appear on adjacent sites. This can be encoded by a graph H whose vertices are the possible spins, with an edge between two vertices exactly when the corresponding spins are allowed to appear on adjacent sites. A valid configuration of spins on the vertices of G is thus exactly an H -coloring of G . Note that in this context it is very natural to allow H to have loops: a loop at vertex v ∈ V ( H ) encodes the fact that two particles on adjacent sites of G are allowed to both have spin v .
Typically, a hard-constraint spin model, with constraint graph H on vertex set { v 0 , . . . , v k } , comes with a vector of positive activities λ = ( λ 0 , λ 1 , . . . , λ k ) which (informally) measure how frequently particles of each spin typically occur. Here λ i is the activity associated with v i . The weight of a configuration, or equivalently the weight of an H -coloring f : V ( G ) → V ( H ), is given by
$$w_{\lambda}(f) = \prod_{i=0}^{k} \lambda_{i}^{|f^{-1}(v_{i})|}, \tag{14}$$
where f^{-1}(v_i) is the set of vertices of G that are mapped to v_i by f . The partition function of the model is the sum
$$Z_{\lambda}(G,H) = \sum_{f \in \operatorname{Hom}(G,H)} w_{\lambda}(f).$$
The partition function is the normalizing constant used to turn the assignment of weights into an assignment of probabilities, making the model stochastic. The partition function encodes significant information about the model.
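The definitions above can be made concrete by brute force on small instances; a minimal sketch (function and variable names are ours), checked against the hard-core constraint graph H ind of the next subsection:

```python
from itertools import product

def partition_function(G_edges, n_G, H_adj, activities):
    """Sum of weights w_lambda(f) = prod_i lambda_i^{|f^{-1}(v_i)|}
    over all H-colorings f of G (H_adj may have 1s on the diagonal = loops)."""
    k = len(activities)
    Z = 0.0
    for f in product(range(k), repeat=n_G):              # candidate maps V(G) -> V(H)
        if all(H_adj[f[u]][f[v]] for u, v in G_edges):   # adjacency-preserving?
            w = 1.0
            for image in f:                              # weight of this coloring
                w *= activities[image]
            Z += w
    return Z

# Hard-core constraint graph H_ind: v_out (looped) adjacent to v_in (unlooped).
H_ind = [[1, 1], [1, 0]]
# On the path P_3 with activities (1, lam), Z = 1 + 3*lam + lam^2.
print(partition_function([(0, 1), (1, 2)], 3, H_ind, (1.0, 2.0)))  # 11.0
```

Here a valid coloring corresponds to an independent set (the preimage of v_in), so at activities (1, λ) the sum is the independence polynomial of G evaluated at λ.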
We present three specific examples; these are the three for which we later present results.
Example 4.15. The hard-core or lattice gas model is a model of the occupation of space by atoms that have non-zero volume, with a 'hard core' around their boundary, meaning that if an atom sits at a site, no other atom can sit at any nearby site. Dating back at least to the investigations of Gaunt and Fisher and of Runnels [22, 36], this model can be represented using two spins, v out representing 'unoccupied' and v in representing 'occupied,' with the occupation rule being that unoccupied sites can be adjacent to each other, unoccupied sites can be adjacent to occupied sites, and occupied sites cannot be adjacent to each other. Since possible collections of occupied sites in this model correspond exactly with the collection of independent sets in G , the hard-core model can be encoded using the target graph H ind , where v out is the looped vertex of H ind and v in is the unlooped vertex. Traditionally, the activity associated with v out in this model is 1 and the activity associated with v in is some λ > 0. If λ is small, then the model favors sparser configurations of occupied vertices, while if λ is large, then it favors denser configurations. If λ = 1, then the model gives all configurations equal weight. See Figure 7 for an example.
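At λ = 1 the hard-core partition function simply counts independent sets, i.e. hom( G, H ind ); on paths this count satisfies a two-term recursion (condition on whether the last vertex is occupied) and produces the Fibonacci numbers, a standard fact we sketch here for illustration:

```python
def count_independent_sets_path(n):
    """Number of independent sets in P_n, i.e. hom(P_n, H_ind)."""
    a, b = 1, 2          # counts for P_0 (empty graph) and P_1
    for _ in range(n - 1):
        a, b = b, a + b  # i(P_m) = i(P_{m-1}) + i(P_{m-2})
    return b if n >= 1 else a

print([count_independent_sets_path(n) for n in range(1, 7)])  # [2, 3, 5, 8, 13, 21]
```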
Example 4.16. The Widom-Rowlinson model is a model of the occupation of space by k mutually repulsive particles. This model was introduced by Widom and Rowlinson [41] as a model of liquidvapor phase transition. There are k +1 spins, v 0 through v k . Empty space is represented by v 0 ,
Figure 7: The constraint graph encoding the hard-core model (left) and a hard-core configuration on a tree (right). If v out has activity 1 and v in has activity λ , then the weight of the illustrated configuration is λ^4 .
and the remaining spins are used to represent k different particles. A site occupied by a particle of type i can be adjacent to empty space or to other particles of type i , but not to particles of type j for any j ≠ i . Empty space comes with no restriction. A valid configuration of particles is thus modeled by an H -coloring of G where H is the fully looped star on vertex set { v 0 , . . . , v k } with v 0 as the central vertex. A vertex of G being mapped to v 0 represents 'unoccupied' and being mapped to v i represents 'occupied by a particle of type i .' Traditionally, the activity associated with v 0 in this model is 1 and for each i > 0 the activity associated with v i is some λ i > 0. See Figure 8 for an example with two particles, denoted red and blue.
Figure 8: The constraint graph encoding the Widom-Rowlinson model (left) and a Widom-Rowlinson configuration on a tree (right). If v 0 has activity 1, v R has activity λ R , and v B has activity λ B , then the weight of the illustrated configuration is λ_R^3 λ_B^4 .
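For path counts such as hom( P n , H WR( k ) ) no enumeration is needed: hom( P n , H ) is the sum of the entries of A^{n−1} applied to the all-ones vector, where A is the adjacency matrix of H (with loops as diagonal ones). A minimal transfer-matrix sketch for the two-particle (k = 2) Widom-Rowlinson constraint graph of Figure 8:

```python
def hom_path(A, n):
    """hom(P_n, H): extend the path one edge at a time, tracking the number
    of partial colorings ending at each vertex of H."""
    k = len(A)
    vec = [1] * k
    for _ in range(n - 1):
        vec = [sum(A[i][j] * vec[j] for j in range(k)) for i in range(k)]
    return sum(vec)

# Fully looped star on {v_0, v_R, v_B} with center v_0.
A_WR = [[1, 1, 1],
        [1, 1, 0],
        [1, 0, 1]]
print([hom_path(A_WR, n) for n in range(1, 5)])  # [3, 7, 17, 41]
```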
Example 4.17. The capacity C independent set or capacity C multicast model was introduced in Section 4.4. Recall that configurations in this model are encoded as H -colorings by taking H = H C to be the graph on vertex set { 0 , 1 , . . . , C } with edge set {{ a, b } : a + b ≤ C } . At C = 1, H C is exactly the constraint graph H ind of the hard-core model of Example 4.15. Traditionally, in this model the activity assigned to vertex i is λ^i for some λ > 0. See Figure 9 for an example.
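The constraint graph H C can be generated directly from its definition; a small sketch (the helper name is ours):

```python
def H_C_adj(C):
    """Adjacency matrix of H_C on vertex set {0, ..., C}: a ~ b iff a + b <= C."""
    return [[1 if a + b <= C else 0 for b in range(C + 1)] for a in range(C + 1)]

print(H_C_adj(1))     # [[1, 1], [1, 0]] -- exactly the hard-core graph H_ind
print(H_C_adj(3)[0])  # [1, 1, 1, 1]: spin 0 is compatible with every spin
```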
The problem of maximizing or minimizing the partition function of a weighted hard-constraint spin model with constraint graph H is very closely related to enumeration of H n -colorings, as we shall now see. Consider a weighted hard-constraint spin model on a graph G with H on vertex set
Figure 9: The constraint graph encoding the capacity 3 independent set model (left) and a capacity 3 independent set on a tree (right). If spin i has activity λ^i , then the illustrated configuration has weight λ^12 .
{ v 0 , v 1 , . . . , v k } encoding the constraints on the spins. Let λ = ( λ 0 , λ 1 , . . . , λ k ) be an assignment of positive rational activities to the spins, and let M be a positive integer such that Mλ i = n i ∈ N for each i . For any H -coloring f : V ( G ) → V ( H ) we have, from equation (14),
$$M^{n} w_{\lambda}(f) = \prod_{i=0}^{k} n_{i}^{|f^{-1}(v_{i})|},$$
where n is the number of vertices of G . Summing over all f ∈ Hom( G,H ) and recalling the definition of the blow-up graph, we get
$$M^{n} Z_{\lambda}(G,H) = \hom(G, H^{\mathbf{n}}). \tag{15}$$
This idea leads to the following lemma.
Lemma 4.18. Let H be an arbitrary graph on vertex set { v 0 , v 1 , . . . , v k } , perhaps with loops but without multiple edges. If G 1 and G 2 are graphs on the same number of vertices with the property that hom( G 1 , H n ) ≤ hom( G 2 , H n ) for every vector n of positive integers of length k +1 , then for every assignment of real positive (not necessarily rational) activities λ to the vertices of H , we have Z λ ( G 1 , H ) ≤ Z λ ( G 2 , H ) .
Proof. When the λ i are all rational, the result follows directly from equation (15). If ( λ i ) k i =0 is not a rational ( k + 1)-tuple, we may consider a sequence of rational ( k + 1)-tuples converging to ( λ i ) k i =0 . The rational case, together with the continuity of Z λ ( G,H ) as a function of λ , allows us to conclude that Z λ ( G 1 , H ) ≤ Z λ ( G 2 , H ) in this case as well.
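The identity M^n Z_λ(G,H) = hom(G, H^n) underlying the lemma can be verified numerically on small instances; a brute-force sketch (all function names are ours), using the hard-core graph on G = P_3 with activities (1, 2) and M = 2, so n = (2, 4):

```python
from itertools import product

def hom(edges, n, H):
    """Count adjacency-preserving maps from a graph on n vertices into H."""
    return sum(all(H[f[u]][f[v]] for u, v in edges)
               for f in product(range(len(H)), repeat=n))

def Z(edges, n, H, lam):
    """Partition function: sum of prod_i lam_i^{|f^{-1}(v_i)|} over H-colorings."""
    total = 0.0
    for f in product(range(len(H)), repeat=n):
        if all(H[f[u]][f[v]] for u, v in edges):
            w = 1.0
            for i in f:
                w *= lam[i]
            total += w
    return total

def blow_up(H, ns):
    """Replace vertex i of H by ns[i] copies; copies adjacent iff originals are
    (so a looped vertex blows up to a fully looped complete graph)."""
    labels = [i for i, m in enumerate(ns) for _ in range(m)]
    return [[H[a][b] for b in labels] for a in labels]

H_ind = [[1, 1], [1, 0]]      # hard-core constraint graph
path3 = [(0, 1), (1, 2)]      # G = P_3
lam, M = (1.0, 2.0), 2        # rational activities, with M * lam = (2, 4)
lhs = M ** 3 * Z(path3, 3, H_ind, lam)
rhs = hom(path3, 3, blow_up(H_ind, (2, 4)))
assert abs(lhs - rhs) < 1e-9  # both equal 88
```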
The question of which graphs in various families maximize and minimize the partition function of various hard-constraint models has a well-developed literature; see e.g. [9, 12, 38] for the Widom-Rowlinson model, [26, 44, 14] for the hard-core model, and [11, 12, 13, 20, 17, 37] for general models.
We can use Lemma 4.18 to draw conclusions about the hard-core model, the Widom-Rowlinson model, and the capacity C independent set model on trees.
Theorem 4.19. Let n ≥ 1 be arbitrary, let T n ∈ T n , and let S n denote the star on n vertices.
- (a) For the hard-core model with λ = (1 , λ ) for arbitrary λ > 0 , we have
$$Z_{\lambda}(P_{n}, H_{\mathrm{ind}}) \leq Z_{\lambda}(T_{n}, H_{\mathrm{ind}}) \leq Z_{\lambda}(S_{n}, H_{\mathrm{ind}}).$$
- (b) For the k -state Widom-Rowlinson model with λ = (1 , λ 1 , . . . , λ k ) for arbitrary λ i > 0 , we have
$$Z_{\lambda}(P_{n}, H_{\mathrm{WR}(k)}) \leq Z_{\lambda}(T_{n}, H_{\mathrm{WR}(k)}) \leq Z_{\lambda}(S_{n}, H_{\mathrm{WR}(k)}).$$
- (c) For the capacity C independent set model with λ = (1 , λ, . . . , λ^{C}) for arbitrary λ > 0 , we have
$$Z_{\lambda}(P_{n}, H_{C}) \leq Z_{\lambda}(T_{n}, H_{C}) \leq Z_{\lambda}(S_{n}, H_{C}).$$
Proof. All three upper bounds follow from Theorem 1.1 and Lemma 4.18. For the lower bounds, it suffices to establish that arbitrary positive integer blow-ups of H ind , H WR( k ) , and H C are Hoffman-London, and then appeal to Lemma 4.18. For H ind and H C , we use that arbitrary blow-ups of loop threshold graphs are still loop threshold and appeal to Theorem 4.10, and we use Corollary 4.14 for H WR( k ) .
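The conclusion of part (a) can be confirmed exhaustively for small n; an illustrative brute-force check (helper names are ours) that enumerates all labeled trees on six vertices via Prüfer sequences:

```python
from itertools import product

def prufer_to_edges(seq):
    """Decode a Prufer sequence into the edge list of a labeled tree."""
    n = len(seq) + 2
    degree = [1] * n
    for v in seq:
        degree[v] += 1
    edges = []
    for v in seq:
        u = min(i for i in range(n) if degree[i] == 1)  # smallest current leaf
        edges.append((u, v))
        degree[u] -= 1
        degree[v] -= 1
    edges.append(tuple(i for i in range(n) if degree[i] == 1))
    return edges

def Z_hardcore(edges, n, lam):
    """Hard-core partition function: f^{-1}(v_in) ranges over independent sets."""
    H = [[1, 1], [1, 0]]  # H_ind: v_out looped, v_in unlooped
    return sum(lam ** sum(f)
               for f in product(range(2), repeat=n)
               if all(H[f[a]][f[b]] for a, b in edges))

n, lam = 6, 0.7
path = [(i, i + 1) for i in range(n - 1)]
Zp = Z_hardcore(path, n, lam)
assert all(Zp <= Z_hardcore(prufer_to_edges(seq), n, lam) + 1e-9
           for seq in product(range(n), repeat=n - 2))
print("path minimizes at n =", n)
```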
## 5 Classification of target graphs with at most three vertices
There are twenty-eight graphs (with or without loops, not necessarily connected) on three or fewer vertices. These are shown in Figure 10. For each H on this list, each n , and each T n ∈ T n , it holds that hom( P n , H ) ≤ hom( T n , H ). The collection of results we have presented thus far establishes this fact and also allows us, for each H , to completely specify which other T n ∈ T n , if any, also minimize the count of H -colorings. As hom( T 1 , H ) = | V ( H ) | for every H , we confine attention to trees on n ≥ 2 vertices.
We begin with those cases in which every tree on n ≥ 2 vertices admits the same number of H -colorings, using Observation 2.2 without additional mention to discuss sets of target graphs that behave identically. For unions of isolated vertices, namely H 1 , H 3 , and H 9 , there are no H -colorings of any such tree. We apply Observation 2.3 to those H whose non-isolated components are regular to find that our trees have unique H 2 -, H 4 -, and H 10 -colorings, two H 6 - and H 13 -colorings, 2^n H 8 - and H 17 -colorings, 3^n H 28 -colorings, and finally 3 · 2^{n−1} H 23 - and H 25 -colorings. Then we use Observation 2.1 to consider target graphs with regular components and see that each such tree has two H 5 -colorings, two H 11 -colorings, three H 12 -colorings, three H 14 -colorings, and 2^n + 1 H 18 -colorings.
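One of these counts can be spot-checked by brute force; a small sketch (helper names are ours) confirming that both the path and the star on five vertices have 3 · 2^{n−1} = 48 colorings by H 25 = K 3 :

```python
from itertools import product

K3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]   # H_25, the triangle

def hom(edges, n, H):
    """Count adjacency-preserving maps from a graph on n vertices into H."""
    return sum(all(H[f[u]][f[v]] for u, v in edges)
               for f in product(range(len(H)), repeat=n))

path5 = [(0, 1), (1, 2), (2, 3), (3, 4)]
star5 = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(hom(path5, 5, K3), hom(star5, 5, K3))  # 48 48
```

The agreement reflects the general fact used above: for any tree, the first vertex can be colored 3 ways and each subsequent vertex 2 ways.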
Because H 19 is a complete bipartite graph with unequal parts, Theorem 2.6 tells us the trees minimizing hom( T n , H 19 ) are precisely those with a balanced bipartition.
Theorem 4.1 demonstrates that graphs obtained from regular graphs by adding looped dominating vertices are strongly Hoffman-London; this covers H 7 , H 21 , H 24 , and H 26 . Furthermore, H 15 is strongly Hoffman-London due to Theorem 4.1 and Observation 2.2, and H 16 is strongly Hoffman-London due to Theorem 4.1 and Observation 2.1.
Note that H 20 × K 2 = P 6 . That hom( P n , H 20 ) ≤ hom( T n , H 20 ) for all n and T n ∈ T n now follows from Observation 2.5 and Theorem 4.2, and that hom( P n , H 20 ) < hom( T n , H 20 ) for all n and T n ∈ T n different from P n follows from Theorem 4.6. Finally, H 22 and H 27 are loop threshold and thus strongly Hoffman-London by Theorem 4.10.
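The identity H 20 × K 2 = P 6 can be checked mechanically; a small sketch, taking H 20 to be the path a–b–c with a loop at the endpoint a (the encoding is ours, and is the one consistent with the identity):

```python
# Build the tensor product H_20 x K_2 and check it looks like the path P_6.
H20 = {('a', 'a'), ('a', 'b'), ('b', 'c')}   # edge set of H_20, loop at a
def adj(x, y):
    return (x, y) in H20 or (y, x) in H20

verts = [(x, i) for x in 'abc' for i in (0, 1)]
# In the tensor product, (u, i) ~ (v, j) iff u ~ v in H_20 and i != j in K_2.
edges = {frozenset({u, v}) for u in verts for v in verts
         if u != v and adj(u[0], v[0]) and u[1] != v[1]}
deg = {v: sum(v in e for e in edges) for v in verts}
# P_6 has five edges, two vertices of degree 1, and four of degree 2.
print(len(edges), sorted(deg.values()))  # 5 [1, 1, 2, 2, 2, 2]
```

Tracing the edges gives the explicit path (c,1)–(b,0)–(a,1)–(a,0)–(b,1)–(c,0), with the loop at a supplying the middle edge.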
For convenience, we summarize these results in Table 1. In cases where all trees on n vertices admit the same number of H -colorings, that number is given.
Figure 10: All target graphs on up to three vertices.
## 6 Open problems
We conclude with a few questions that remain open and which we find of interest.
In addition to Problem 1.2, Csikvári and Lin propose a weaker variant in which 'all n ' is replaced by 'all sufficiently large n .' To consider this version of the problem, we introduce the following terminology.

Table 1: Classification of tree minimizers of hom( T n , H ) for n ≥ 2.

| Identifier | Minimizer | Identifier | Minimizer | Identifier | Minimizer |
|--------------|------------------|--------------|---------------------|--------------|--------------------------|
| H 1 | All trees (0) | H 11 | All trees (2) | H 20 | Paths |
| H 2 | All trees (1) | H 12 | All trees (3) | H 21 | Paths |
| H 3 | All trees (0) | H 13 | All trees (2) | H 22 | Paths |
| H 4 | All trees (1) | H 14 | All trees (3) | H 23 | All trees ($3 \cdot 2^{n-1}$) |
| H 5 | All trees (2) | H 15 | Paths | H 24 | Paths |
| H 6 | All trees (2) | H 16 | Paths | H 25 | All trees ($3 \cdot 2^{n-1}$) |
| H 7 | Paths | H 17 | All trees ($2^n$) | H 26 | Paths |
| H 8 | All trees ($2^n$) | H 18 | All trees ($2^n + 1$) | H 27 | Paths |
| H 9 | All trees (0) | H 19 | Trees with balanced bipartitions | H 28 | All trees ($3^n$) |
| H 10 | All trees (1) | | | | |
Definition 6.1. We say a target graph H is eventually Hoffman-London if for all sufficiently large n ,
$$\operatorname{hom}(P_n, H) = \min_{T_n \in \mathcal{T}_n} \operatorname{hom}(T_n, H).$$
If it holds that hom( P n , H ) < hom( T n , H ) for all sufficiently large n and all T n ∈ T n \ { P n } , we say that H is eventually strongly Hoffman-London .
At present, no target graphs are known to be eventually Hoffman-London but not Hoffman-London (nor eventually strongly Hoffman-London but not strongly Hoffman-London). In other words, there is no known target graph H for which some non-path tree T n admits fewer H -colorings than P n for a specific n but the path minimizes hom( T n , H ) once n is sufficiently large. To this end, we introduce the following problem.
Problem 6.2. Does there exist a target graph H and integers α < β such that

$$\operatorname{hom}(P_\alpha, H) > \min_{T_\alpha \in \mathcal{T}_\alpha} \operatorname{hom}(T_\alpha, H)$$

but

$$\operatorname{hom}(P_n, H) = \min_{T_n \in \mathcal{T}_n} \operatorname{hom}(T_n, H)$$

for each n ≥ β ?
Theorem 4.6 proves that paths on an even number of vertices other than P 2 are strongly Hoffman-London. As mentioned, Theorem 2.6 demonstrates that the case is more complicated for P 3 : the set of trees with the minimal number of P 3 -colorings is neither the path alone nor all trees. For paths with a larger odd number of vertices, the question remains open.
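The P 3 case can be checked directly for small trees: hom( T, H ) is computable by a leaf-to-root dynamic program over the adjacency matrix of H . The sketch below is our own illustration (the function names and graph encodings do not appear in the paper); it exhibits hom( T, P 3 ) = $2^s + 2^{n-s}$, where s and n − s are the sizes of the bipartition classes of T , so trees with balanced bipartitions minimize, while for the fully looped triangle every tree on n vertices receives exactly $3^n$ colorings.

```python
def hom_count(n, edges, A):
    """Count H-colorings of a tree on vertices 0..n-1 (edge list `edges`)
    into a target with adjacency matrix A (a 1 on the diagonal encodes a
    loop), by dynamic programming from the leaves toward the root 0."""
    q = len(A)
    nbrs = [[] for _ in range(n)]
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)

    def below(v, parent):
        # below(v)[c] = number of colorings of the subtree at v with v -> c
        res = [1] * q
        for w in nbrs[v]:
            if w != parent:
                sub = below(w, v)
                res = [res[c] * sum(A[c][d] * sub[d] for d in range(q))
                       for c in range(q)]
        return res

    return sum(below(0, None))

P3 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]          # path target a - b - c
n = 6
path = [(i, i + 1) for i in range(n - 1)]        # bipartition sizes 3, 3
star = [(0, i) for i in range(1, n)]             # bipartition sizes 1, 5

# hom(T, P3) = 2^s + 2^(n-s) for bipartition class sizes s and n - s.
assert hom_count(n, path, P3) == 2**3 + 2**3     # 16: balanced, minimal
assert hom_count(n, star, P3) == 2**1 + 2**5     # 34

# Fully looped triangle: every tree on n vertices has 3^n colorings.
looped_K3 = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
assert hom_count(n, path, looped_K3) == 3**6
assert hom_count(n, star, looped_K3) == 3**6
```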
Problem 6.3. For m ≥ 2, classify the T ∈ T n that minimize hom( T, P 2 m +1 ).
Theorem 2.6 shows that K a,b is Hoffman-London, but it remains open whether complete multipartite graphs H with more than two parts need be Hoffman-London.
Problem 6.4. Let r ≥ 3 and let H be a complete r -partite graph. Does it follow that H is Hoffman-London?
Note that K a 1 ,...,a q -colorings may be viewed as weighted q -colorings, so this question is equivalent to asking whether there exist weights such that some tree on n vertices has fewer weighted q -colorings than P n .
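The equivalence can be seen in a small brute-force check (a minimal sketch with our own encodings, not code from the paper): homomorphisms into $K_{2,1,1}$ are counted exactly by proper 3-colorings in which color c carries weight $a_c$, the size of part c.

```python
from itertools import product
from math import prod

def hom_brute(n, edges, A):
    """Brute-force count of adjacency-preserving maps from a graph on
    vertices 0..n-1 into a target with 0/1 adjacency matrix A."""
    q = len(A)
    return sum(all(A[f[u]][f[v]] for u, v in edges)
               for f in product(range(q), repeat=n))

# K_{2,1,1}: parts {0, 1}, {2}, {3}; edges join vertices in distinct parts.
part = [0, 0, 1, 2]
A = [[int(part[i] != part[j]) for j in range(4)] for i in range(4)]

def weighted_colorings(n, edges, weights):
    """Proper 3-colorings, each weighted by the product of its color
    weights; here the weights are the part sizes a = (2, 1, 1)."""
    return sum(prod(weights[c] for c in f)
               for f in product(range(3), repeat=n)
               if all(f[u] != f[v] for u, v in edges))

star = [(0, 1), (0, 2), (0, 3)]
assert hom_brute(4, star, A) == weighted_colorings(4, star, (2, 1, 1))  # both 70
```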
As discussed in Section 4.2, we proved that each graph H ( a, b, ℓ ), in which each vertex of a b -clique is replaced with a bouquet of ℓ a -cliques, is Hoffman-London except when a ≥ 3, b = 1, and ℓ ≥ 2.
Problem 6.5. For which values of a ≥ 3 and ℓ ≥ 2, if any, is H ( a, 1 , ℓ ) Hoffman-London?
Finally, we proved in Theorem 4.10 that loop threshold graphs are Hoffman-London. There are three closely related classes of graphs for which it would be interesting to prove an analogous result:
1. Threshold graphs: the graphs in the smallest family containing K 1 that is closed under adding isolated vertices and (unlooped) dominating vertices.
2. Quasi-loop threshold graphs: the graphs in the smallest family containing $K_1$ and $K_1^\circ$ that is closed under adding isolated vertices, adding looped dominating vertices, and taking disjoint unions.
3. Quasi-threshold graphs: the graphs in the smallest family containing K 1 that is closed under adding isolated vertices, adding (unlooped) dominating vertices, and taking disjoint unions.
There are threshold (and so quasi-threshold) graphs on four vertices that the methods and results discussed in this paper cannot handle, such as the triangle with a pendant edge. There are also quasi-loop threshold graphs on four vertices whose status remains open, including the graph obtained from the union of a loop and two isolated vertices by adding a dominating vertex.
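The recursive definitions above lend themselves to a creation-sequence encoding. As a sketch (our own helper names and encoding, not from the paper), the following builds a threshold graph from a sequence of isolated/dominating additions; the triangle with a pendant edge mentioned above arises from the sequence d, i, d applied to K 1.

```python
def threshold_graph(seq):
    """Build a threshold graph, starting from K_1, by processing a
    creation sequence: 'i' adds an isolated vertex and 'd' adds an
    (unlooped) dominating vertex adjacent to every earlier vertex."""
    edges = set()
    n = 1                          # start from K_1 (vertex 0)
    for step in seq:
        v = n
        if step == 'd':
            edges.update((u, v) for u in range(n))
        n += 1
    return n, sorted(edges)

# Triangle with a pendant edge: add dominating, then isolated, then
# dominating.  Vertices 0, 1, 3 form the triangle; 2 is the pendant.
n, edges = threshold_graph('did')
assert n == 4
assert edges == [(0, 1), (0, 3), (1, 3), (2, 3)]
```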
## Acknowledgments
This work was started at the workshop 'Graph Theory: structural properties, labelings, and connections to applications' hosted by the American Institute of Mathematics (AIM), Pasadena, CA, July 22–26, 2024. The authors thank AIM and the organizers of the workshop for facilitating their collaboration.
## References
- [1] Graham R. Brightwell and Peter Winkler, Graph homomorphisms and phase transitions, Journal of Combinatorial Theory Series B 77 (1999), 415-435.
- [2] Graham R. Brightwell and Peter Winkler, Hard constraints and the Bethe lattice: adventures at the interface of combinatorics and statistical physics, Proceedings of the International Congress of Mathematicians, Vol. III (Beijing, 2002), 605-624. Higher Education Press, Beijing, 2002.
- [3] George D. Birkhoff, A Determinant Formula for the Number of Ways of Coloring a Map, Annals of Mathematics 14 (1912/1913), 42-46.
- [4] G. D. Birkhoff and D. C. Lewis, Chromatic Polynomials, Transactions of the American Mathematical Society 60 (1946), 355-451.
- [5] Péter Csikvári, On a poset of trees, Combinatorica 30 (2010), 125-137.
- [6] Péter Csikvári and Zhicong Lin, Graph homomorphisms between trees, Electronic Journal of Combinatorics 21 (2014), article #P4.9.
- [7] Péter Csikvári and Zhicong Lin, Homomorphisms of trees into a path, SIAM Journal on Discrete Mathematics 29 (2015), 1406-1422.
- [8] Václav Chvátal and Peter L. Hammer, Aggregation of inequalities in integer programming, in Studies in Integer Programming, P. L. Hammer, E. L. Johnson, B. H. Korte and G. L. Nemhauser editors, (Proceedings of the Workshop on Programming, Bonn, 1975), Annals of Discrete Mathematics 1 (1977), 145-162.
- [9] Emma Cohen, Will Perkins, and Prasad Tetali, On the Widom-Rowlinson occupancy fraction in regular graphs, Combinatorics, Probability & Computing 26 (2017), 183-194.
- [10] Jonathan Cutler, Coloring graphs with graphs: a survey, Graph Theory Notes of New York 63 (2012), 7-16.
- [11] Jonathan Cutler and Nicholas Kass, Homomorphisms into loop-threshold graphs, Electronic Journal of Combinatorics 27 (2020), article #P2.38.
- [12] Jonathan Cutler and A. J. Radcliffe, Extremal graphs for homomorphisms, Journal of Graph Theory 67 (2011), 261-284.
- [13] Jonathan Cutler and A. J. Radcliffe, Extremal Graphs for Homomorphisms II, Journal of Graph Theory 76 (2014), 42-59.
- [14] Ewan Davies, Matthew Jenssen, Will Perkins, and Barnaby Roberts, Independent sets, matchings, and occupancy fractions, Journal of the London Mathematical Society 96 (2017), 47-66.
- [15] John Engbers and David Galvin, Extremal H-colorings of trees and 2-connected graphs, Journal of Combinatorial Theory Series B 122 (2017), 800-814.
- [16] Jon Folkman, Regular Line-Symmetric Graphs, Journal of Combinatorial Theory 3 (1967), 215-232.
- [17] David Galvin, Bounding the partition function of spin systems, Electronic Journal of Combinatorics 13 (2006), #R72.
- [18] David Galvin, Fabio Martinelli, Kavita Ramanan, and Prasad Tetali, The multistate hard core model on a regular tree, SIAM Journal on Discrete Mathematics 25 (2011), 894-915.
- [19] David Galvin, Emily McMillon, JD Nir, and Amanda Redlich, Long paths need not minimize H -colorings among trees, arXiv:2510.18770.
- [20] David Galvin and Prasad Tetali, On weighted graph homomorphisms, Discrete Mathematics and Theoretical Computer Science 63 (2004), 97-104.
- [21] David Galvin, Greyson Wesley, and Bailee Zacovic, Enumerating threshold graphs and some related graph classes, Journal of Integer Sequences 25 (2022), Article 22.2.7.
- [22] David S. Gaunt and Michael E. Fisher, Hard-Sphere Lattice Gases. I. Plane-Square Lattice, Journal of Chemical Physics 43 (1965), 2840-2863.
- [23] Chris Godsil and Gordon Royle, Algebraic Graph Theory, Springer, 2001.
- [24] P. de la Harpe and V. F. R. Jones, Graph Invariants Related to Statistical Mechanical Models: Examples and Problems, Journal of Combinatorial Theory, Series B 57 (1993), 207-227.
- [25] A. J. Hoffman, Three observations on nonnegative matrices, Journal of Research of the National Bureau of Standards 71B (1967), 39-41.
- [26] Jeff Kahn, An Entropy Approach to the Hard-Core Model on Bipartite Graphs, Combinatorics, Probability & Computing 10 (2001), 219-237.
- [27] Julian Keilson and Adri Kester, Monotone matrices and monotone Markov processes, Stochastic Processes and their Applications 5 (1977), 231-241.
- [28] A. K. Kelmans, On graphs with randomly deleted edges, Acta Mathematica Academiae Scientiarum Hungaricae 37 (1981), 77-88.
- [29] A. M. Leontovich, The number of mappings of graphs, the ordering of graphs and the Muirhead theorem, Problemy Peredachi Informatsii 25 (1989), 91-104; translation in Problems Inform. Transmission 25 (1989), 154-165.
- [30] N. Linial, Legal colorings of graphs, Combinatorica 6 (1986), 49-54.
- [31] David London, Two inequalities in nonnegative symmetric matrices, Pacific Journal of Mathematics 16 (1966), 515-536.
- [32] László Lovász, Large Networks and Graph Limits, American Mathematical Society Colloquium Publications volume 60, Providence, Rhode Island, 2012.
- [33] A. E. Mazel and Yu. M. Suhov, Random surfaces with two-sided constraints: an application of the theory of dominant ground states, Journal of Statistical Physics 64 (1991), 111-134.
- [34] Helmut Prodinger and Robert F. Tichy, Fibonacci numbers of graphs, Fibonacci Quarterly 20 (1982), 16-21.
- [35] Kavita Ramanan, Anirvan Sengupta, Ilze Ziedins, and Partha Mitra, Markov random field models of multicasting in tree networks, Advances in Applied Probability 34 (2002), 1-27.
- [36] L. K. Runnels, Hard-Square Lattice Gas, Physical Review Letters 15 (1965), 581-584.
- [37] Ashwin Sah, Mehtaab Sawhney, David Stoner, and Yufei Zhao, A reverse Sidorenko inequality, Inventiones mathematicae 221 (2020), 665-711.
- [38] Luke Sernau, Graph operations and upper bounds on graph homomorphism counts, Journal of Graph Theory 87 (2018), 149-163.
- [39] A. F. Sidorenko, Proof of London's assumption on sums of elements of nonnegative matrices (in Russian), Matematicheskie Zametki 38 (1985), 376-377, 475; translation in Proof of London's conjecture on sums of elements of positive matrices, Mathematical notes of the Academy of Sciences of the USSR 38 (1985), 716-717.
- [40] Alexander Sidorenko, A partially ordered set of functionals corresponding to graphs, Discrete Mathematics 131 (1994), 263-277.
- [41] B. Widom and J. S. Rowlinson, New Model for the Study of Liquid-Vapor Phase Transitions, Journal of Chemical Physics 52 (1970), 1670-1684.
- [42] Herbert S. Wilf, Backtrack: An O (1) expected time algorithm for the graph coloring problem, Information Processing Letters 18 (1984), 119-121.
- [43] George Clifton Wingard, Properties and applications of the Fibonacci polynomial of a graph, Ph.D. thesis, University of Mississippi (1995).
- [44] Yufei Zhao, The Number of Independent Sets in a Regular Graph, Combinatorics, Probability & Computing 19 (2010), 315-320.
- [45] Yufei Zhao, Extremal regular graphs: independent sets and graph homomorphisms, American Mathematical Monthly 124 (2017), 827-843.