Reading List

The most recent articles from a list of feeds I subscribe to.

Hollow Knight: Silksong DLC is coming in 2026

Hollow Knight: Silksong is getting a new expansion, Sea of Sorrow, in 2026, Team Cherry announced.

Cackling Referee Declares Penalty For Pass Interference Shall Be 10,000 Years Of Winter

PITTSBURGH—Raising his hands before him as his eyes turned ominously white, the referee of the Steelers–Dolphins game was heard to let out a blood-curdling cackle Monday before declaring the penalty for defensive pass interference would be “no fewer than 10,000 years of winter.” “Hear me, mortals, and know that for the grave transgression of hindering […]

Sources: the US has paused a tech trade deal with the UK, signed in September, over disagreements about the UK's online safety rules and digital services taxes (New York Times)

New York Times:
The U.S. government has paused a tech-focused trade pledge with Britain over broader disagreements about Britain's digital regulations and food safety rules.

Silksong is getting a free expansion next year

It's still hard to believe that Hollow Knight: Silksong actually came out this year, but now we all have a new thing to wait for: the game is getting a free expansion in 2026, titled Sea of Sorrow. Team Cherry calls it the game's "first big expansion." "New areas, bosses, tools, and more!" Team Cherry […]

Allen Institute for AI launches Bolmo 7B and Bolmo 1B, claiming they are "the first fully open byte-level language models", built on its Olmo 3 models (Emilia David/VentureBeat)

Emilia David / VentureBeat:
Enterprises that want tokenizer-free multilingual models are increasingly turning to byte-level language models to reduce brittleness in noisy or low-resource text.