llama.cpp/examples/server/webui/src/utils
characharm 8ca6e1c3a4
server : webui : Improve Chat Input with Auto-Sizing Textarea (#12785)
* Update ChatScreen.tsx

* useAutosizeTextarea.ts

Add a useAutosizeTextarea hook to encapsulate the auto-sizing logic.

* Implement responsive auto-sizing chat textarea

Replaces the manual textarea resizing with an automatic height adjustment based on content.

- Adds a `useChatTextarea` hook to manage textarea state and auto-sizing logic via refs, preserving the existing optimization.
- Textarea now grows vertically up to a maximum height (`lg:max-h-48`) on large screens (lg breakpoint and up).
- Disables auto-sizing and enables manual vertical resizing (`resize-vertical`) on smaller screens for better mobile usability.
- Aligns the "Send" button to the bottom of the textarea (`items-end`) for consistent positioning during resize.
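The resize behavior described above can be sketched as a small helper. This is an illustrative sketch only, not the actual code from `useChatTextarea`; the names `resizeToContent`, `SizableTextarea`, and `isLargeScreen` are assumptions, and in the real hook the element comes from a React ref while the max height is capped by the `lg:max-h-48` CSS class:

```typescript
// Minimal shape of a textarea for the purposes of this sketch:
// real code would use an HTMLTextAreaElement obtained via a ref.
interface SizableTextarea {
  style: { height: string };
  scrollHeight: number; // content height in px, as reported by the browser
}

// Grow (or shrink) the textarea to fit its content. On large screens the
// CSS max-height caps the growth; on small screens auto-sizing is skipped
// so the user can resize manually (resize-vertical).
function resizeToContent(el: SizableTextarea, isLargeScreen: boolean): void {
  if (!isLargeScreen) return; // manual resizing only on mobile
  el.style.height = 'auto'; // reset first so scrollHeight can shrink too
  el.style.height = `${el.scrollHeight}px`;
}
```

Resetting the height to `auto` before reading `scrollHeight` is the standard trick that lets the textarea shrink again when content is deleted, not just grow.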

* update compressed index.html.gz after `npm run build`

* refactor: replace OptimizedTextareaValue with AutosizeTextareaApi in the VSCode context hook

* chore: normalize line endings to LF
refactor: AutosizeTextareaApi -> chatTextareaApi

* refactor: Rename interface to PascalCase

---------

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
2025-04-08 11:14:59 +02:00
app.context.tsx server : (webui) Enable communication with parent html (if webui is in iframe) (#11940) 2025-02-18 23:01:44 +01:00
common.tsx server : (webui) revamp Settings dialog, add Pyodide interpreter (#11759) 2025-02-08 21:54:50 +01:00
llama-vscode.ts server : webui : Improve Chat Input with Auto-Sizing Textarea (#12785) 2025-04-08 11:14:59 +02:00
misc.ts webui : add ?m=... and ?q=... params (#12148) 2025-03-03 11:42:45 +01:00
storage.ts server : (webui) introduce conversation branching + idb storage (#11792) 2025-02-10 21:23:17 +01:00
types.ts server : (webui) Enable communication with parent html (if webui is in iframe) (#11940) 2025-02-18 23:01:44 +01:00