colinsane | 15 hours ago
sure:

  git clone https://github.com/nixos/nixpkgs
  ANTHROPIC_BASE_URL=https://openrouter.ai/api \
  ANTHROPIC_AUTH_TOKEN=<make_an_account_on_openrouter_and_get_this_from_the_settings_panel> \
  claude --model qwen/qwen3.6-plus:free

then the prompt:

> This repository has two ways of packaging Nix packages: defining them via pkgs/top-level/all-packages.nix (the old way), or defining them via the pkgs/by-name directory (the new way). Let's port my_example_package over to the new way.

i'm not actually working in the nixpkgs repo -- i'm trying these in a private repo with a very similar structure. i'm also a n00b with these tools, so it's probably a bad prompt. but Qwen 3.6 actually conflates "the old way" with "the new way", attempts the port in reverse, and just gets stuck. gemma-4 E4B does better. even gpt-oss-120b, an open-weight model from a _year_ ago, does the full port unattended.

so either it's shit at coding, or i'm using it wrong. curious to hear other anecdotes.
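for anyone unfamiliar with what the prompt is asking for: in nixpkgs, the old style is a callPackage entry in pkgs/top-level/all-packages.nix pointing at a package file elsewhere, while the new style puts the file at pkgs/by-name/<first two letters>/<name>/package.nix and applies callPackage automatically, so the all-packages.nix entry gets deleted. a minimal sketch (the path to the old default.nix and the derivation contents here are placeholders, not from the thread):

  # old way: entry in pkgs/top-level/all-packages.nix
  #   my_example_package = callPackage ../some/path/to/default.nix { };
  #
  # new way: this file lives at pkgs/by-name/my/my_example_package/package.nix
  # ("my" = first two letters of the package name); no all-packages.nix entry needed
  { lib, stdenv, fetchFromGitHub }:

  stdenv.mkDerivation {
    pname = "my_example_package";
    version = "0.1.0";
    # src, buildInputs, etc. carried over unchanged from the old default.nix
  }

so the port the models keep botching is mostly: move the file, rename it package.nix, delete one line from all-packages.nix.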
guteubvkk | 14 hours ago
gpt-oss-120b is vastly better than gemma-4 E4B
CamperBob2 | 11 hours ago
How does OpenRouter manage to run closed-weight models like Qwen 3.6? Did the Qwen team actually have to cooperate with them by contributing the weights?