Recursion

How simple rules create infinite meaning

What is recursion

A woman holding an object that contains a smaller image of her holding an identical object.

Recursion is a process in which something refers to itself in order to continue. Instead of moving forward in a straight line, a recursive process turns inward, applying the same rule again and again, each step depending on a previous version of itself.

Recursion is often confused with repetition, but the two are not the same. Repetition copies the same action without change. Recursion, on the other hand, carries information forward. Each iteration is shaped by what came before it. Because of this, recursion can produce complexity from simplicity, and structure from a single rule.

Every recursive process requires a stopping point, known as a base case. Without it, recursion continues forever. This balance between continuation and termination is what makes recursion powerful, but also fragile.
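As a small illustration of this idea, here is the classic factorial function in Python (a minimal sketch of my own, not something referenced elsewhere in this text):

```python
def factorial(n):
    # Base case: the stopping point that keeps recursion finite.
    if n <= 1:
        return 1
    # Recursive case: each step depends on a smaller version of itself.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```

Remove the base case and the function never resolves: continuation without termination.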

I find recursion interesting because it appears everywhere once you start looking for it: in mathematics, symbols, natural forms, and systems of feedback. It feels both logical and poetic, a way of understanding how patterns persist, evolve, and sometimes trap themselves. I think that recursion is not just a technical concept, but a way of thinking about how things return to themselves and change based on what came before.

Below, I present some examples of recursion I found interesting.

Ouroboros

An image of a snake eating its own tail.

The ouroboros is an ancient symbol depicting a serpent consuming its own tail. Found across many cultures and historical periods, it has been used to represent cycles, eternity, destruction and renewal. The image is simple, but its meaning is dense: the beginning and the end collapse into the same form.

As a symbol, the ouroboros is a visual metaphor for recursion. The snake's action produces itself as its own result. There is no external input and no forward movement - only continuation through self-reference. Like a recursive function without a base case, it loops endlessly.

What makes the ouroboros especially interesting to me is that it captures both the elegance and the danger of recursion. The loop can symbolize balance and regeneration, but it can also suggest stagnation or self-consumption. Without interruption or change, the process never resolves. It simply repeats itself.

The ouroboros does not move forward; it survives by consuming itself.

Golden Ratio

A video of the golden spiral, where the shape is infinitely repeated when magnified.

The golden ratio is a mathematical relationship often associated with balance, proportion, and harmony. It appears when something is divided into two parts such that the ratio of the whole to the larger part is the same as the ratio of the larger part to the smaller. This value is approximately 1.618034... . This proportion has been observed in art, architecture, and natural forms, and is frequently linked to aesthetic appeal.

While the golden ratio itself is not recursive, it emerges from recursive processes. One of the most well-known examples is the Fibonacci sequence, in which each number is defined by the sum of the two numbers before it. As this sequence continues, the ratio between successive numbers approaches the golden ratio. In this way, a simple recursive rule gradually produces a stable and recognizable structure.

I find this relationship interesting because it shows how recursion can lead to convergence rather than infinity. Unlike the ouroboros, which loops endlessly, the recursive process behind the golden ratio moves toward balance. Each step depends on the previous ones, but the system stabilizes instead of collapsing or spiraling outward.

The golden ratio suggests that recursion does not always result in repetition or excess. Given the right conditions, recursive growth can create order, proportion, and coherence. I think this is in stark contrast to more obvious recursive symbols, showing that recursion can be subtle, hidden, and structural rather than overtly circular.

Fibonacci (interactive)

Each term is the sum of the two before it: F(n) = F(n−1) + F(n−2). As the sequence grows, the ratio of consecutive terms approaches φ ≈ 1.618034.

Terms: 2 of 50
Sequence so far: 1, 1
Last ratio: 1 / 1 = 1.000000
Distance from φ: |1.000000 − 1.618034| = 0.618034
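The computation behind the interactive display can be sketched in a few lines of Python (my own sketch; the function name is hypothetical, not taken from the page):

```python
def fib_ratio(n):
    """Ratio F(n) / F(n-1) of consecutive Fibonacci terms, for n >= 2."""
    a, b = 1, 1           # F(1), F(2)
    for _ in range(n - 2):
        a, b = b, a + b   # the output of one step is the input to the next
    return b / a

PHI = (1 + 5 ** 0.5) / 2  # the golden ratio, (1 + sqrt(5)) / 2

for n in (2, 5, 10, 20):
    r = fib_ratio(n)
    print(n, r, abs(r - PHI))
```

Running this shows the distance from φ shrinking rapidly: the recursive rule converges instead of spiraling.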

Fractals

Fractals are geometric forms generated through recursive rules, where the same pattern repeats at different scales. Unlike traditional shapes, fractals reveal new detail the closer you look. Zooming in does not simplify the image - it reproduces the structure again, often endlessly.

A single mathematical rule is applied repeatedly, and each iteration builds upon the last. This makes fractals a clear example of recursion made visible: the output of one step becomes the input for the next.
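The "output becomes input" structure is easy to see in a well-known fractal, the Koch curve, where every subdivision replaces each segment with four smaller ones. A small Python sketch (my own, under the standard Koch construction):

```python
def koch_segments(depth):
    # Base case: before any subdivision there is one straight segment.
    if depth == 0:
        return 1
    # Recursive case: each subdivision turns every segment into 4 segments.
    return 4 * koch_segments(depth - 1)

def koch_length(depth):
    # Each level shrinks segments to 1/3 length but makes 4x as many,
    # so the total length grows by a factor of 4/3 per level, without bound.
    return koch_segments(depth) * (1 / 3) ** depth

print(koch_segments(3))          # 64 segments after 3 subdivisions
print(koch_length(2))            # total length (4/3)^2
```

The shape stays inside a finite region, yet its length grows forever as the rule is reapplied.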

I find fractals especially compelling because they resist completion. Unlike the golden ratio, which converges toward balance, fractals never resolve into a final form. There is no natural stopping point. Each level suggests another level beneath it, creating a sense of infinite depth contained within a finite space.

Fractals appear not only in mathematics, but also in natural structures such as coastlines, plants, clouds, and branching systems. A well-known example is the snowflake. These forms suggest that recursion is not just an abstract idea, but a practical method for organizing complexity.

Feedback loops

A feedback loop is a system in which the output of a process becomes an input to the same process. Instead of ending after one cycle, the result returns and influences what happens next. In this sense, feedback loops are recursion extended through time: the system continuously updates itself using its own outcomes.

In programming, feedback loops appear in different forms. Recursive functions are a direct expression of this idea, where a function calls itself with modified input until a base case is reached. Each call depends on the result of the previous one. Without a stopping condition, the function does not resolve and instead collapses into infinite recursion.
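The contrast between a function that resolves and one that collapses can be sketched directly (a toy example of my own; Python enforces the collapse with a RecursionError once its call-depth limit is exceeded):

```python
def countdown(n):
    if n == 0:                  # base case: recursion terminates here
        return "done"
    return countdown(n - 1)     # each call depends on the previous one

def runaway(n):
    return runaway(n - 1)       # no base case: never resolves

print(countdown(5))             # done

try:
    runaway(5)
except RecursionError:
    print("infinite recursion hit the interpreter's depth limit")
```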

Feedback loops also appear at a larger scale in software systems. When training machine learning models, especially neural networks, the model produces an output that is evaluated against a target. The error from that evaluation is then fed back into the system to adjust the model’s parameters. This process repeats many times, gradually shaping the model’s behavior.
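The training loop described above can be reduced to a toy sketch: fitting a single weight by repeatedly feeding the prediction error back into the parameter (my own minimal example, not a real training pipeline):

```python
# Toy feedback loop: fit y = w * x by feeding the error back into w.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # true relationship: y = 2x
w = 0.0
lr = 0.05  # learning rate: the strength of the corrective (negative) feedback

for _ in range(200):
    for x, y in data:
        error = w * x - y    # output evaluated against the target
        w -= lr * error * x  # error fed back to adjust the parameter

print(round(w, 3))  # converges close to 2.0
```

Each pass uses the outcome of the previous one; the corrective feedback is what makes the loop settle rather than diverge.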

While this process is not recursion in the strict mathematical sense, it shares the same core structure: outputs become inputs, and patterns reinforce themselves through repetition. Small biases or errors can be amplified through positive feedback, while corrective mechanisms act as negative feedback to stabilize learning.

I find this connection interesting because it shows how recursive thinking scales from code to systems. Whether in a function, a program, or a learning model, feedback loops reveal how behavior emerges through repetition. This also highlights the importance of limits and intervention, a reminder that without constraints or reflection, recursive systems, both technical and human, can spiral out of control.

So...

Recursion describes how patterns repeat, but it also reveals where repetition must end. Across symbols, mathematics, natural forms, and systems, recursion appears as a powerful method for generating complexity. Yet in every example, its value depends on the presence of limits, balance, or intervention.

The ouroboros shows recursion without resolution. The golden ratio shows recursion that converges. Fractals reveal recursion that expands beyond completion. Feedback loops demonstrate how recursive processes shape real systems over time. Together, these examples have shaped how I understand repetition: as a structure that carries memory and consequence.

For me, recursion is more than just a technical concept. It is a way of recognizing patterns in thinking, learning, and behavior. It explains how habits form, how systems stabilize or spiral, and how simple rules can produce outcomes far larger than their origins.

Every recursive process eventually needs a base case. In understanding recursion, I have learned not only how patterns persist, but also when they should be allowed to end.