Introduction
Graphs provide a powerful mathematical framework for modelling complex systems, from molecular structures to social networks. In many physical and geometric problems, nodes represent particles, and edges encode interactions, often acting like springs. This perspective aligns naturally with Geometric Deep Learning, where learning algorithms leverage graph structures to capture spatial and relational patterns.
Understanding energy functions and the forces derived from them is fundamental to modelling such systems. In physics and computational chemistry, harmonic potentials, which penalise deviations from equilibrium positions, are widely used to describe elastic networks, protein structures, and even diffusion processes. The Laplacian matrix plays a key role in these formulations, linking energy minimisation to force computations in a clean and computationally efficient way.
By formalising these interactions using matrix notation, we gain not only a compact representation but also a foundation for more advanced techniques such as Langevin dynamics, normal mode analysis, and graph-based neural networks for physical simulations.
Harmonic Energy
Let \(G = (V, E)\) be an undirected graph, where \(V = \{1, \dots, N\}\) is the set of \(N\) nodes, each associated with a position vector \(\mathbf{x}_i \in \mathbb{R}^d\), and \(E \subseteq V \times V\) is the set of edges connecting pairs of nodes. The structure of the graph is encoded in the adjacency matrix \(\mathbf{A}\), where \(A_{ij} = 1\) if nodes \(i\) and \(j\) are connected, and \(A_{ij} = 0\) otherwise. The degree matrix \(\mathbf{D}\) is the diagonal matrix with \(D_{ii} = \sum_{j} A_{ij}\), the degree of node \(i\). The graph Laplacian is then defined as \(\mathbf{L} = \mathbf{D} - \mathbf{A}\), and it governs both the energy and the force computations. Let \(\mathbf{X}\) be the \(N \times d\) matrix whose \(i\)-th row is \(\mathbf{x}_i^\top\).
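As a minimal sketch of these definitions, we can assemble \(\mathbf{A}\), \(\mathbf{D}\), and \(\mathbf{L}\) for a small illustrative graph (the path on four nodes below is an arbitrary example, not one taken from the text):

```python
import numpy as np

# Example edge list for a path graph on 4 nodes (illustrative choice).
edges = [(0, 1), (1, 2), (2, 3)]
N = 4

A = np.zeros((N, N))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0   # undirected graph: symmetric adjacency

D = np.diag(A.sum(axis=1))    # degree matrix D_ii = sum_j A_ij
L = D - A                     # graph Laplacian

# Every row of L sums to zero, so L annihilates constant vectors.
print(L @ np.ones(N))         # -> [0. 0. 0. 0.]
```

The zero row sums are what make the Laplacian translation-invariant: shifting all positions by the same vector changes neither the energy nor the forces derived below.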
The harmonic (spring-like) energy of this system is
\[
E(\mathbf{X}) \;=\; \frac{1}{2} \sum_{(i,j)\in E} \|\mathbf{x}_i - \mathbf{x}_j\|^2 \;=\; \frac{1}{2}\,\mathrm{tr}\!\left(\mathbf{X}^\top \mathbf{L}\,\mathbf{X}\right).
\]
Hence, the harmonic energy can be expressed succinctly in terms of the graph Laplacian, giving us a compact way of defining the total energy of the system.
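A short numerical check of this identity, computing the energy both as a sum over edges and via the Laplacian trace (graph and positions are arbitrary example values):

```python
import numpy as np

# Example triangle graph with random positions in 2D.
edges = [(0, 1), (1, 2), (2, 0)]
N, d = 3, 2
rng = np.random.default_rng(0)
X = rng.normal(size=(N, d))   # node positions, one row per node

A = np.zeros((N, N))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Energy as a sum of spring terms over edges...
E_edges = 0.5 * sum(np.sum((X[i] - X[j]) ** 2) for i, j in edges)
# ...and as the Laplacian quadratic form.
E_lap = 0.5 * np.trace(X.T @ L @ X)
assert np.isclose(E_edges, E_lap)
```

The trace form is the one that scales: for large sparse graphs, \(\mathbf{L}\mathbf{X}\) is a sparse matrix–vector product rather than a loop over edges.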
Energy to Forces
Using the definition of the total energy in terms of the Laplacian, the force on node \(i\) is the negative gradient of \(E\) with respect to that node's position:
\[
\mathbf{F}_i \;=\; -\frac{\partial E}{\partial \mathbf{x}_i} \;=\; -\sum_{j} L_{ij}\,\mathbf{x}_j,
\qquad\text{or, in matrix form,}\qquad
\mathbf{F} \;=\; -\mathbf{L}\,\mathbf{X}.
\]
For a specific pair of connected nodes \((i, j)\), expanding \(L_{ij} = D_{ij} - A_{ij}\) shows that the force on node \(i\) due to node \(j\) is \(\mathbf{F}_{i \leftarrow j} = -(\mathbf{x}_i - \mathbf{x}_j)\), and vice versa \(\mathbf{F}_{j \leftarrow i} = -(\mathbf{x}_j - \mathbf{x}_i) = -\mathbf{F}_{i \leftarrow j}\): the pairwise forces are antisymmetric, as Newton's third law requires.
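These two facts can be verified numerically, the gradient by finite differences and the antisymmetry directly (graph and positions are again arbitrary examples):

```python
import numpy as np

# Example graph and random positions in 3D.
edges = [(0, 1), (1, 2), (2, 0), (0, 3)]
N, d = 4, 3
rng = np.random.default_rng(1)
X = rng.normal(size=(N, d))

A = np.zeros((N, N))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

def energy(X):
    return 0.5 * np.trace(X.T @ L @ X)

F = -L @ X   # analytic forces, F = -L X

# Central finite-difference check of one gradient component.
h = 1e-6
Xp = X.copy(); Xp[0, 0] += h
Xm = X.copy(); Xm[0, 0] -= h
fd = -(energy(Xp) - energy(Xm)) / (2 * h)
assert np.isclose(F[0, 0], fd, atol=1e-5)

# Pairwise forces along an edge are antisymmetric.
i, j = edges[0]
f_ij = -(X[i] - X[j])   # force on i due to j
f_ji = -(X[j] - X[i])   # force on j due to i
assert np.allclose(f_ij, -f_ji)
```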
Near Equilibrium
For each edge \((i,j)\), the bond energy is given by
\[
E_{ij} \;=\; \frac{1}{2}\,k_{ij}\left(\|\mathbf{x}_i - \mathbf{x}_j\| - l_{ij}\right)^2,
\]
where
- \(k_{ij}\) is the bond (force) constant,
- \(l_{ij}\) is the equilibrium bond length, and
- \(\mathbf{x}_i,\mathbf{x}_j\in\mathbb{R}^d\) are the positions of nodes \(i\) and \(j\).
Define the bond stretch (or strain) as \(\epsilon_{ij} \;=\; \|\mathbf{x}_i-\mathbf{x}_j\| - l_{ij}\).
Note that the equilibrium condition is defined \emph{per bond} (i.e. for the pair \((i,j)\), the bond is at equilibrium if \(\epsilon_{ij} = 0\)). For small bond strains (i.e. when \(\epsilon_{ij}\) is small) we can linearise the energy via a Taylor expansion around the equilibrium point. Writing \(\|\mathbf{x}_i-\mathbf{x}_j\| = f(s)\) with \(f(s) = \sqrt{s}\) and \(s = \|\mathbf{x}_i-\mathbf{x}_j\|^2\), the Taylor expansion of \(f(s)\) about \(s=l_{ij}^2\) is
\[
\sqrt{s} \;\approx\; l_{ij} \;+\; \frac{s - l_{ij}^2}{2\,l_{ij}}.
\]
Then the energy per bond becomes
\[
E_{ij} \;=\; \frac{1}{2}\,k_{ij}\,\epsilon_{ij}^2.
\]
For configurations near equilibrium:
\[
\epsilon_{ij} \;\approx\; \frac{\|\mathbf{x}_i-\mathbf{x}_j\|^2 - l_{ij}^2}{2\,l_{ij}}.
\]
To express \(\epsilon_{ij}\) in terms of the relative positions, note that the bond stretch is naturally measured along the bond direction. Define the equilibrium unit vector for bond \((i,j)\) as
\[
\hat{\mathbf{n}}_{ij} \;=\; \frac{\mathbf{x}_i^* - \mathbf{x}_j^*}{l_{ij}},
\]
where \(\mathbf{x}_i^*\) and \(\mathbf{x}_j^*\) are any positions satisfying \(\|\mathbf{x}_i^*-\mathbf{x}_j^*\|=l_{ij}\). We can then linearly approximate \(\epsilon_{ij}\) as
\[
\epsilon_{ij} \;\approx\; \hat{\mathbf{n}}_{ij}^\top\!\left[(\mathbf{x}_i-\mathbf{x}_j) - (\mathbf{x}_i^*-\mathbf{x}_j^*)\right].
\]
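A quick numerical sanity check of this linearisation, with arbitrary illustrative values for the equilibrium geometry and a small random perturbation:

```python
import numpy as np

rng = np.random.default_rng(2)
l_ij = 1.5
x_i_star = np.array([1.5, 0.0, 0.0])     # equilibrium positions chosen so that
x_j_star = np.zeros(3)                   # ||x_i* - x_j*|| = l_ij
n_hat = (x_i_star - x_j_star) / l_ij     # equilibrium unit vector

delta = 1e-3 * rng.normal(size=3)        # small perturbation of node i
x_i = x_i_star + delta
x_j = x_j_star

# Exact strain vs. its projection onto the bond direction.
eps_exact = np.linalg.norm(x_i - x_j) - l_ij
eps_lin = n_hat @ ((x_i - x_j) - (x_i_star - x_j_star))
assert abs(eps_exact - eps_lin) < 1e-5   # error is second order in |delta|
```

The discrepancy between the two quantities shrinks quadratically with the perturbation size, which is exactly what a first-order Taylor approximation predicts.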
Since only the differences \((\mathbf{x}_i-\mathbf{x}_j)\) matter, the energy is naturally a function of the bond deformations. Collecting these into a quadratic form over all edges yields a weighted Laplacian operator. In matrix form we can write
\[
E \;\approx\; \frac{1}{2}\,\delta\mathbf{X}^\top \mathbf{L}'\,\delta\mathbf{X},
\]
where \(\delta\mathbf{X}\) stacks the node displacements \(\delta\mathbf{x}_i = \mathbf{x}_i - \mathbf{x}_i^*\), so that only the bond differences \(\delta\mathbf{x}_i - \delta\mathbf{x}_j\) enter the energy, and \(\mathbf{L}'\) is the block weighted Laplacian defined by
\[
\mathbf{L}'_{ij} \;=\;
\begin{cases}
\displaystyle\sum_{m:\,(i,m)\in E} k_{im}\,\hat{\mathbf{n}}_{im}\hat{\mathbf{n}}_{im}^\top, & i = j,\\[8pt]
-\,k_{ij}\,\hat{\mathbf{n}}_{ij}\hat{\mathbf{n}}_{ij}^\top, & (i,j)\in E,\\[4pt]
\mathbf{0}, & \text{otherwise.}
\end{cases}
\]
This final form expresses the energy solely in terms of bond deformations, which depend only on the differences between pairs of node positions, and does not require an absolute displacement for each node.
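To make this concrete, here is a sketch that assembles the block weighted Laplacian for a small example system (bond constants, equilibrium positions, and displacements are all arbitrary illustrative values) and checks the quadratic form against the per-bond energies:

```python
import numpy as np

N, d = 3, 2
edges = [(0, 1), (1, 2)]
k = {(0, 1): 2.0, (1, 2): 0.5}           # example bond constants
X_star = np.array([[0.0, 0.0],            # example equilibrium positions
                   [1.0, 0.0],
                   [1.0, 1.0]])

# Assemble the (N*d) x (N*d) block weighted Laplacian L'.
Lp = np.zeros((N * d, N * d))
for (i, j) in edges:
    diff = X_star[i] - X_star[j]
    n = diff / np.linalg.norm(diff)       # equilibrium unit vector n_hat
    block = k[(i, j)] * np.outer(n, n)    # k_ij * n_hat n_hat^T
    Lp[i*d:(i+1)*d, i*d:(i+1)*d] += block
    Lp[j*d:(j+1)*d, j*d:(j+1)*d] += block
    Lp[i*d:(i+1)*d, j*d:(j+1)*d] -= block
    Lp[j*d:(j+1)*d, i*d:(i+1)*d] -= block

rng = np.random.default_rng(3)
dX = rng.normal(size=(N, d))              # node displacements delta x_i

# Quadratic form vs. explicit sum of per-bond energies.
E_quad = 0.5 * dX.reshape(-1) @ Lp @ dX.reshape(-1)
E_bonds = sum(
    0.5 * k[(i, j)]
    * ((X_star[i] - X_star[j]) / np.linalg.norm(X_star[i] - X_star[j])
       @ (dX[i] - dX[j])) ** 2
    for i, j in edges
)
assert np.isclose(E_quad, E_bonds)

# A rigid translation of all nodes costs no energy: L' annihilates it.
t = np.tile([0.3, -0.7], N)
assert np.isclose(t @ Lp @ t, 0.0)
```

The final translation check illustrates the closing point above: the energy depends only on pairwise differences, never on absolute node positions.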
Conclusion
The harmonic energy framework presented here is more than just a mathematical curiosity; it has direct applications in physics, chemistry, and machine learning. By deriving forces from energy functions in a graph-theoretic setting, we can simulate molecular dynamics, analyse structural stability, and even develop graph-based learning models for physical systems.
One immediate extension of this work is in Langevin dynamics, where forces derived from harmonic potentials are combined with stochastic noise and damping to model the thermal motion of particles. This is crucial for studying protein folding, diffusion in complex environments, and simulating soft materials. Additionally, these formulations are integral to normal mode analysis, which helps in understanding vibrational properties in physics and materials science.
Beyond physics, these ideas also connect with graph-based neural networks that incorporate geometric constraints, such as those used in drug discovery, 3D shape analysis, and biomechanical simulations. By combining geometric deep learning with physically inspired models, we can develop data-driven approaches that remain grounded in fundamental principles of mechanics and thermodynamics.