New York’s Algorithmic Pricing Disclosure Act imposes new disclosure requirements on businesses that set prices using algorithms informed by consumer personal data. The law took effect on November 10, 2025, and targets “surveillance pricing,” also called “data-driven pricing”: the practice of factoring personal information into price adjustments for goods or services.
The act requires any covered business operating in New York to include a clear and conspicuous notice alongside the pricing offer or display stating: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.” The disclosure must appear directly next to the price, both online and in-store, and must be visible and easily understandable to the consumer. The act broadly defines “personal data” as any information that identifies, or could reasonably be linked to, an individual or their device. Personal data does not include location data used solely to calculate a fare based on mileage and trip duration between a passenger’s pickup and drop‑off locations.
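For an online storefront, the disclosure can be attached wherever the price is rendered. The sketch below illustrates one way to do this in Python; the function and flag names are hypothetical, and nothing in the act prescribes a particular implementation.

```python
# Statutory notice text required by the act (quoted from the disclosure requirement).
STATUTORY_NOTICE = "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA."

def render_price_display(price: str, uses_personal_data_algorithm: bool) -> str:
    """Return the price line for display, appending the required notice
    directly alongside the price whenever the price was set by an
    algorithm using the consumer's personal data.

    Both names here are illustrative; the flag would be driven by
    however the business's pricing system tracks which prices are
    personalized.
    """
    if uses_personal_data_algorithm:
        # Place the notice on the line immediately below the price so it
        # is conspicuous and adjacent, as the act requires.
        return f"{price}\n{STATUTORY_NOTICE}"
    return price
```

A business would call this at every point where a personalized price is shown, so the notice cannot be dropped from any one display surface.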
The law applies to both online and physical stores, though it exempts certain businesses from the disclosure requirement.
Noncompliance may result in civil penalties up to $1,000 per occurrence. The New York attorney general enforces the law but must first issue a cease-and-desist letter that allows a cure period before pursuing penalties. The act does not provide a private right of action, does not require proof of consumer harm, and preserves the possibility of additional remedies under other state consumer protection laws.
Earlier in 2025, the National Retail Federation challenged the law, arguing that it violated the First Amendment because it forced businesses to express a government opinion without justification. The U.S. District Court for the Southern District of New York disagreed, finding the disclosure requirement “factual and uncontroversial” and reasonably related to the government’s legitimate interest in making sure consumers are well informed about the products offered to them, including pricing.
Changing prices based on data about consumers is not a new phenomenon. Various forms of “price discrimination,” the economic term for charging different prices to different buyers for the same product, have existed for more than a century. What has changed is the amount of personal data available, the ease of access to it, and the speed at which prices can be adjusted. These shifts have caused regulators and lawmakers to become increasingly concerned about pricing practices, especially in the housing, food, and retail industries, where risks include unfair discrimination, lack of transparency, and other distortions in the market.
For example, in 2024, the Federal Trade Commission (FTC) ordered eight companies that provide surveillance pricing products to conduct a study examining how these pricing practices affect consumers. The results of that study revealed that details such as location, demographics, and browsing history are commonly used to target individual consumers with different prices for the same goods. The staff research summaries highlighted that even subtle behavioral data, such as mouse movements and abandoned online shopping carts, is tracked and used by retailers to set prices. While the FTC initially showed a strong interest in this issue, current leadership has signaled that the study is no longer a priority, leaving state legislatures to take the lead in proposing laws addressing algorithmic pricing and personal data.
For business owners, the act represents a shift toward stronger consumer protection in pricing practices and increased regulatory scrutiny over how companies communicate their use of data. The act leaves open the possibility of a broad interpretation of algorithmic pricing, which means that even lawful price variations could draw attention if linked to personal data. Violations, intentional or not, may bring steep fines. Businesses should consider taking the following steps to reduce the risk of noncompliance: