Promote 0.1.16 to the current stable Walter On-Prem release
Improve the setup and admin LLM flows so that provider checks, model suggestions, and the test and save states are clearer and more consistent
Add more descriptive LLM validation feedback for local OpenAI-compatible hosts and for provider-side auth and endpoint failures
Operator notes
bin/walter-onprem upgrade now resolves to 0.1.16 for existing 0.1.x installs
OpenRouter credentials are now probed explicitly during validation, which avoids treating the public model catalog as proof that an API key is valid
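The distinction matters because a public model catalog responds the same way with or without a valid key. A minimal sketch of the check, assuming a hypothetical `check_openrouter_key` helper with an injected status function (the endpoint paths and return values are illustrative, not Walter's actual validation code):

```python
def check_openrouter_key(get_status):
    """Validate an API key by probing an authenticated endpoint
    rather than the public model catalog.

    `get_status(path, authenticated)` is a hypothetical injected
    transport returning an HTTP status code.
    """
    # The public catalog succeeds with or without a valid key,
    # so it proves nothing about the credential itself.
    catalog_ok = get_status("/api/v1/models", authenticated=False) == 200

    # Only an explicitly authenticated probe confirms the key.
    key_ok = get_status("/api/v1/auth/key", authenticated=True) == 200

    if key_ok:
        return "valid"
    if catalog_ok:
        # Catalog reachable but the key was rejected: report an
        # auth failure instead of a misleading success.
        return "invalid-credentials"
    return "endpoint-unreachable"


# Example: a stub provider where the catalog is public but the key is bad.
def stub(path, authenticated):
    if path == "/api/v1/models":
        return 200
    return 401 if authenticated else 404

print(check_openrouter_key(stub))  # → invalid-credentials
```

Before this release, a check equivalent to `catalog_ok` alone would have reported the key as valid.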
Local providers such as LM Studio now surface connection-refused and timeout failures directly in the Walter UI, which makes post-upgrade LLM troubleshooting more obvious
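One way such failures can be mapped to operator-facing messages, sketched with a hypothetical `describe_llm_error` helper (the real Walter UI strings may differ):

```python
import socket


def describe_llm_error(exc, base_url):
    """Translate low-level connection errors from a local
    OpenAI-compatible host (e.g. LM Studio) into a message an
    operator can act on. Hypothetical helper for illustration."""
    if isinstance(exc, ConnectionRefusedError):
        return (f"Connection refused at {base_url}: is the local "
                f"server running and listening on that port?")
    if isinstance(exc, (socket.timeout, TimeoutError)):
        return (f"Timed out reaching {base_url}: check the host, "
                f"port, and any firewall between Walter and the provider.")
    # Fall back to the raw error rather than hiding it.
    return f"Unexpected error from {base_url}: {exc!r}"


print(describe_llm_error(ConnectionRefusedError(),
                         "http://localhost:1234/v1"))
```

Surfacing these messages directly in the UI replaces the generic failure state that previously required checking server logs after an upgrade.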
No database migration step or bundle refresh is required for this release