SEC Clarifies Broker-Dealer Scope for Crypto Asset Securities Interfaces
The quiet reshaping of tokenized securities access in the U.S.
On April 13, 2026, the U.S. Securities and Exchange Commission, through its Division of Trading and Markets, released a staff statement addressing the application of broker-dealer registration requirements to certain user interfaces used to prepare transactions in crypto asset securities.
There is a tendency, when approaching statements of this kind, to focus on their formal status. Whether they bind. Whether they create rights. Whether they can be relied upon.
That instinct, while understandable, risks missing the more important development.
Because this statement is not primarily about formal rulemaking. It is about something more structural: the gradual redefinition of how access to securities markets may be organised when those markets are mediated through software rather than institutions.
And in that respect, its implications for tokenized securities are difficult to overstate.
Not a rule, but not neutral either
The document is carefully framed. It does not constitute a rule adopted by the Commission. It does not alter existing law. It does not establish a safe harbor. It carries no independent legal force and creates no new obligations for market participants.
But that framing should not be mistaken for irrelevance.
Statements of this kind operate as expressions of regulatory posture. They reveal how the staff understands the application of existing statutory concepts in new contexts. They indicate where enforcement sensitivity may lie, and where, at least for now, regulatory tolerance may exist.
In that sense, the document performs a dual function.
Formally, it does very little.
Practically, it does quite a lot.
It articulates a conditional space within which certain actors, specifically those operating transaction-preparation interfaces for crypto asset securities, may exist without immediately triggering broker-dealer registration, provided their role remains carefully circumscribed.
The structural problem tokenized securities create
The underlying issue the statement addresses is not unique to crypto. It is a consequence of technological change.
U.S. securities regulation developed in an environment defined by intermediation. Investors did not interact directly with markets. They accessed them through brokers, who aggregated, routed, and executed orders within established infrastructures.
Tokenized securities disrupt that model at the level of architecture.
They introduce systems in which:
- investors can define transaction parameters directly,
- execution logic is embedded in protocols or smart contracts,
- custody may be self-directed,
- and the interface between the user and the market is software rather than an institution.
This creates an immediate tension.
If every layer that facilitates interaction with a securities market is treated as an intermediary, then the regulatory perimeter expands to capture actors that do not fit the traditional conception of a broker. If, on the other hand, those layers are treated as neutral tools, the question becomes how to ensure that substantive intermediation does not simply migrate into software form.
The statement is an attempt to resolve that tension without disturbing the statutory framework.
The key distinction: preparation versus participation
At the centre of the analysis lies a distinction that, while not new, is being applied here in a novel way.
The Exchange Act defines a broker as a person engaged in the business of effecting transactions in securities for the account of others.
The staff does not seek to redefine that concept. Instead, it seeks to clarify its boundary. The distinction is between systems that prepare transactions and systems that participate in them.
A system that prepares a transaction translates user-defined parameters into executable instructions. It makes interaction possible. It reduces complexity. It provides visibility into potential outcomes.
A system that participates in a transaction, by contrast, exercises influence over its terms, its execution, or its economic consequences. It may shape user decisions, direct order flow, or embed incentives that affect how and where transactions occur.
The statement suggests that the former category may, under certain conditions, fall outside broker-dealer registration. The latter does not.
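The preparation/participation distinction can be made concrete with a minimal sketch. The structure below is illustrative only, not drawn from the statement: the type and field names are assumptions, and a real system would be far richer. The point it shows is architectural: a pure preparation layer copies user-defined parameters verbatim into an executable instruction, and never selects a venue, adjusts a term, or injects a preference of its own.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UserParameters:
    """Parameters the user defines directly; the interface never alters them."""
    asset_id: str
    quantity: int
    limit_price: float
    venue: str  # chosen by the user, not by the interface

def prepare_instruction(p: UserParameters) -> dict:
    """Translate user-defined parameters into an executable instruction.

    A pure translation layer: every field of the output is copied
    verbatim from the user's input. Nothing here shapes the transaction's
    terms, routing, or economics, which is what would mark the system
    as a participant rather than a preparer.
    """
    return {
        "asset": p.asset_id,
        "qty": p.quantity,
        "limit": p.limit_price,
        "venue": p.venue,
    }
```

A system that, say, silently substituted a "better" venue inside `prepare_instruction` would cross into participation under the framework described above.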
The emergence of a neutrality standard
What follows from this distinction is the construction of what can best be understood as a neutrality standard.
The interface must remain, in substance, non-intermediating. It must not cross the line from enabling user action to influencing or shaping it.
This requirement is expressed not through a single rule, but through a set of interlocking constraints.
User decision-making must remain central. The interface may assist judgment, but not displace it. Default parameters may exist, but must be customisable. Educational material may be provided, but must not shade into recommendation.
The prohibition on soliciting specific transactions reinforces this boundary. The interface provider may promote access to the tool, but may not encourage particular investment outcomes.
Even the presentation of information is regulated at the level of structure. Execution routes may be displayed, but not endorsed. Sorting may occur, but only according to objective, pre-defined criteria. Language that implies superiority or preference is excluded.
What emerges is a view of neutrality that is both behavioural and architectural. It is not enough that the provider refrains from explicit recommendation. The system itself must be designed in a way that avoids embedding implicit preferences.
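The constraint on route display can be sketched in a few lines. This is a hypothetical illustration, not the statement's language: the sort key `"quoted_price"` is an assumed example of an objective, pre-disclosed criterion. What matters is what the function does not do: it attaches no "best" or "recommended" label, applies no hidden weighting, and returns only the ordered data.

```python
def display_routes(routes: list[dict], sort_key: str = "quoted_price") -> list[dict]:
    """Order execution routes by a single pre-disclosed, objective criterion.

    The sort key is fixed and published in advance ("quoted_price" is an
    illustrative assumption). No route is endorsed or labelled as
    preferred; the output is the ordered data itself, nothing more.
    """
    return sorted(routes, key=lambda r: r[sort_key])
```

Embedding an implicit preference, for example boosting an affiliated venue in the ordering, is exactly the kind of architectural non-neutrality the framework excludes.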
Objectivity, transparency, and the limits of algorithmic discretion
This design constraint extends into the logic of the system.
The statement requires that the software underpinning transaction preparation and route display operate on parameters that are pre-disclosed, objective, and independently verifiable.
This is, in effect, a requirement of algorithmic transparency.
It reflects a concern that discretion can be exercised not only by human actors, but by systems whose internal logic is opaque to users. Where optimisation processes are not visible, they may embed priorities or incentives that are inconsistent with the characterisation of the interface as neutral.
For tokenized securities platforms, this introduces a significant constraint.
The more sophisticated and adaptive the system becomes, the more difficult it may be to demonstrate that it operates purely on objective, disclosed criteria. Intelligence, in this context, risks being reinterpreted as discretion.
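One way such a requirement could be operationalised, offered here purely as a design sketch and not as anything the statement prescribes, is to publish the parameter set governing route display and give users a deterministic fingerprint of it, so that the logic in use can be checked against the disclosure.

```python
import hashlib
import json

# Hypothetical disclosed parameter set governing route display.
DISCLOSED_PARAMETERS = {
    "sort_key": "quoted_price",
    "sort_order": "ascending",
    "tie_breaker": "venue_name",
}

def parameter_fingerprint(params: dict) -> str:
    """Deterministic fingerprint of a disclosed parameter set.

    Serialising with sorted keys makes the hash reproducible, so a user
    or auditor can independently verify that the parameters actually in
    use match the ones that were published. Any silent change to the
    logic changes the fingerprint.
    """
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()
```

The reproducibility is the point: objectivity and verifiability are properties a system can be built to demonstrate, not merely assert.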
Compensation as a proxy for function
Perhaps the most consequential aspect of the framework lies in its treatment of compensation.
The interface provider’s remuneration must be fixed, paid by the user, and independent of transaction-specific variables such as size, execution venue, or counterparty.
This requirement does more than address conflicts of interest. It operates as a proxy for functional classification.
In U.S. securities law, transaction-based compensation has long been associated with broker activity. Where an entity’s revenue depends on the occurrence or characteristics of securities transactions, it suggests a direct economic interest in those transactions and, by extension, participation in the transaction process.
By contrast, a fixed, user-paid fee aligns the provider’s incentives with access and functionality rather than execution outcomes.
In the context of tokenized securities, this distinction becomes particularly important. Many digital market structures rely on layered and indirect monetisation models, including routing incentives, affiliate arrangements, or spread-based economics. The exclusion of such models from the non-objection framework signals that neutrality is not only a matter of conduct, but of economic alignment.
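The economic distinction can be stated in two short functions. Both are hypothetical illustrations, but the contrast is the substance of the framework: the permitted model ignores every transaction-specific variable, while the excluded model cannot help but create an interest in the transactions themselves.

```python
def fixed_user_fee(monthly_rate: float, **_tx_details) -> float:
    """Fixed fee paid by the user.

    Deliberately ignores all transaction details (size, venue,
    counterparty), so the provider's revenue depends on access and
    functionality, not on execution outcomes.
    """
    return monthly_rate

def transaction_based_fee(notional: float, bps: float) -> float:
    """Fee that scales with transaction size, quoted in basis points.

    This is the pattern the non-objection framework excludes: revenue
    tied to the occurrence or characteristics of securities transactions
    has long been treated as a hallmark of broker activity.
    """
    return notional * bps / 10_000
```

Under the first model, a $1,000 order and a $1,000,000 order cost the provider's user the same; under the second, the provider earns a thousand times more on the larger trade, and with it an economic stake in execution.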
The boundary of intermediation
The statement reinforces this approach by identifying a set of activities that fall outside the scope of the non-objection position.
These include negotiating transaction terms, making recommendations, arranging financing, handling assets, executing or settling transactions, and taking or routing orders.
Taken together, these functions describe the core of what has traditionally been understood as brokerage.
Their exclusion serves a dual purpose. It clarifies the limits of the permitted activity, and it provides a negative definition of intermediation in a software-based environment.
A layered market structure
What emerges from this analysis is a vision of market structure that is more modular than the traditional model.
The interface layer may, under constrained conditions, operate outside broker-dealer registration. The core functions of execution, custody, and settlement remain firmly within the regulatory perimeter.
The result is a layered system in which:
- access may be provided by software,
- execution occurs within regulated environments,
- and regulatory obligations attach to functions rather than to a single, vertically integrated entity.
For tokenized securities, this is a significant development. It suggests that the architecture of the market may evolve in ways that are structurally different from traditional systems, while remaining anchored in the same underlying regulatory principles.
The emerging design constraint
The practical implication of the statement is that regulatory analysis increasingly attaches to design.
It is no longer sufficient to ask whether an entity is registered, or whether a particular activity falls within a predefined category.
The relevant questions become more granular.
How are decisions made within the system?
How is information presented?
How are incentives structured?
How is the system monetised?
An interface that is architected as a neutral translation layer, governed by objective logic and funded through transparent user fees, may fall within the contours of the staff’s non-objection position. An interface that embeds optimisation, influence, or transaction-dependent economics may not.
Conclusion
The staff statement does not resolve the broader questions surrounding tokenized securities. Its scope is limited. Its authority is non-binding. Its future depends on further Commission action.
But it does provide something that has, until now, been largely absent: a structured framework for thinking about the role of software in securities market access.
The issue is not whether an interface exists. It is whether that interface, in substance, acts as an intermediary, or remains an instrument through which the user acts.
And in that distinction, between participation and facilitation, the contours of tokenized securities market structure in the United States are beginning to take shape.