
πŸ› fix(chat-input): preserve fullscreen editor state and send behavior#13481

Merged
Innei merged 2 commits intocanaryfrom
fix/chat-input-fullscreen-behavior
Apr 1, 2026
Merged

πŸ› fix(chat-input): preserve fullscreen editor state and send behavior#13481
Innei merged 2 commits intocanaryfrom
fix/chat-input-fullscreen-behavior

Conversation

@Innei Innei (Member) commented Apr 1, 2026

💻 Change Type

  • ✨ feat
  • 🐛 fix
  • ♻️ refactor
  • 💄 style
  • 👷 build
  • ⚡️ perf
  • ✅ test
  • 📝 docs
  • 🔨 chore

🔗 Related Issue

Fixes LOBE-6603

🔀 Description of Change

  • Preserve and restore editor JSON content when toggling chat input fullscreen mode.
  • Render fullscreen chat input through the desktop layout container portal to keep layout behavior consistent.
  • Adjust fullscreen keyboard behavior so Enter inserts a newline and Cmd/Ctrl+Enter sends.
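The adjusted Enter-key semantics can be sketched as a small pure helper. Everything here (`InputAction`, `resolveFullscreenEnter`) is an illustrative assumption, not taken from the actual lobe-chat source:

```typescript
// Hypothetical helper sketching the fullscreen keyboard semantics described
// above; names and shapes are illustrative only.
type InputAction = 'send' | 'newline' | 'none';

interface EnterKeyInfo {
  key: string;
  metaKey: boolean; // Cmd on macOS
  ctrlKey: boolean; // Ctrl on Windows/Linux
}

// In fullscreen mode: plain Enter inserts a newline, Cmd/Ctrl+Enter sends.
function resolveFullscreenEnter(e: EnterKeyInfo): InputAction {
  if (e.key !== 'Enter') return 'none';
  return e.metaKey || e.ctrlKey ? 'send' : 'newline';
}
```

This inverts the usual compact-input convention (Enter sends, Shift+Enter for a newline), which suits long-form editing in an expanded editor.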

🧪 How to Test

  • Tested locally
  • Added/updated tests
  • No tests needed

📸 Screenshots / Videos

Before: N/A, After: N/A

πŸ“ Additional Information

  • This change is scoped to desktop chat input behavior only.

Made with Cursor

Keep chat input content and interaction consistent when toggling fullscreen by restoring editor JSON state, adjusting Enter/Cmd+Enter semantics, and rendering fullscreen input in the desktop layout container.


vercel bot commented Apr 1, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Actions | Updated (UTC) |
| --- | --- | --- | --- |
| lobehub | Ready | Preview, Comment | Apr 1, 2026 4:55pm |


@sourcery-ai sourcery-ai bot (Contributor) left a comment


We've reviewed this pull request using the Sourcery rules engine.

Automatically exit fullscreen after sending from chat input so users do not need a second manual collapse action, and clear saved editor snapshot to avoid stale restore.
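The suggested behavior can be sketched as a simple state transition. The state shape and names here (`ChatInputState`, `afterSend`) are hypothetical illustrations, not the project's actual store:

```typescript
// Hypothetical reducer-style sketch of the reviewer's suggestion: after a
// message is sent from fullscreen, collapse automatically and drop the saved
// editor snapshot so a later toggle cannot restore stale content.
interface ChatInputState {
  isFullscreen: boolean;
  savedEditorJSON: Record<string, unknown> | null;
}

function afterSend(state: ChatInputState): ChatInputState {
  return {
    ...state,
    isFullscreen: false,   // no second manual collapse action needed
    savedEditorJSON: null, // avoid restoring a stale snapshot next time
  };
}
```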

@lobehubbot (Member) commented:

@ONLY-yours @canisminor1990 - This is a desktop chat input fix (fullscreen editor state preservation and send behavior). Please take a look.


codecov bot commented Apr 1, 2026

Codecov Report

❌ Patch coverage is 0% with 6 lines in your changes missing coverage. Please review.
✅ Project coverage is 66.48%. Comparing base (df6d8f1) to head (24c0dca).
⚠️ Report is 2 commits behind head on canary.

Additional details and impacted files
@@            Coverage Diff             @@
##           canary   #13481      +/-   ##
==========================================
- Coverage   66.48%   66.48%   -0.01%     
==========================================
  Files        1972     1972              
  Lines      160184   160189       +5     
  Branches    18340    18340              
==========================================
  Hits       106504   106504              
- Misses      53560    53565       +5     
  Partials      120      120              
Flag Coverage Δ
app 58.31% <0.00%> (-0.01%) ⬇️
database 96.66% <ø> (ø)
packages/agent-runtime 88.98% <ø> (ø)
packages/context-engine 86.51% <ø> (ø)
packages/conversation-flow 92.36% <ø> (ø)
packages/file-loaders 87.02% <ø> (ø)
packages/memory-user-memory 66.68% <ø> (ø)
packages/model-bank 99.85% <ø> (ø)
packages/model-runtime 84.67% <ø> (ø)
packages/prompts 67.07% <ø> (ø)
packages/python-interpreter 92.90% <ø> (ø)
packages/ssrf-safe-fetch 0.00% <ø> (ø)
packages/utils 90.41% <ø> (ø)
packages/web-crawler 88.82% <ø> (ø)

Flags with carried forward coverage won't be shown.

Components Coverage Δ
Store 66.66% <ø> (ø)
Services 49.27% <ø> (ø)
Server 66.34% <ø> (ø)
Libs 51.03% <ø> (ø)
Utils 89.08% <ø> (ø)

@Innei Innei merged commit d8534c2 into canary Apr 1, 2026
40 of 41 checks passed
@Innei Innei deleted the fix/chat-input-fullscreen-behavior branch April 1, 2026 18:13
@lobehubbot (Member) commented:

❤️ Great PR @Innei ❤️

The growth of the project is inseparable from user feedback and contributions; thanks for your contribution! If you are interested in the lobehub developer community, please join our Discord and DM @arvinxx or @canisminor1990. They will invite you to our private developer channel, where we discuss lobe-chat development and share AI news from around the world.

arvinxx added a commit that referenced this pull request Apr 7, 2026
# 🚀 release: 20260407

This release includes **148 commits**. Key updates are below.

- **Response API tool execution is more capable and reliable** — Added hosted builtin tools + client-side function tools and improved tool-call streaming/completion behavior.
[#13406](#13406)
[#13414](#13414)
[#13506](#13506)
[#13555](#13555)
- **Input and composition UX upgraded** — Added AI input auto-completion and multiple chat-input stability fixes.
[#13458](#13458)
[#13551](#13551)
[#13481](#13481)
- **Model/provider compatibility improved** — Better Gemini/Google tool schema handling and additional model updates.
[#13429](#13429)
[#13465](#13465)
[#13613](#13613)
- **Desktop and CLI reliability improved** — Gateway WebSocket support and desktop runtime upgrades.
[#13608](#13608)
[#13550](#13550)
[#13557](#13557)
- **Security hardening continued** — Fixed auth and sanitization risks and upgraded vulnerable dependencies.
[#13535](#13535)
[#13529](#13529)
[#13479](#13479)

### Models & Providers

- Added/updated support for `glm-5v-turbo`, GLM-5.1 updates, and
qwen3.5-omni series.
[#13487](#13487)
[#13405](#13405)
[#13422](#13422)
- Added additional ImageGen providers/models (Wanxiang 2.7 and Keling
from Qwen). [#13478](#13478)
- Improved Gemini/Google tool schema and compatibility handling across
runtime paths. [#13429](#13429)
[#13465](#13465)
[#13613](#13613)

### Response API & Runtime

- Added hosted builtin tools in Response API and client-side function
tool execution support.
[#13406](#13406)
[#13414](#13414)
- Improved stream tool-call argument handling and `response.completed`
output correctness.
[#13506](#13506)
[#13555](#13555)
- Improved runtime error/context handling for intervention and provider
edge cases. [#13420](#13420)
[#13607](#13607)

### Desktop App

- Bumped desktop dependencies and runtime integrations (`agent-browser`,
`electron`). [#13550](#13550)
[#13557](#13557)
- Simplified desktop release channel setup by removing nightly release
flow. [#13480](#13480)

### CLI

- Added OpenClaw migration command.
[#13566](#13566)
- Added local device binding support for `lh agent run`.
[#13277](#13277)
- Added WebSocket gateway support and reconnect reliability
improvements. [#13608](#13608)
[#13418](#13418)

### Security

- Removed risky `apiKey` fallback behavior in webapi auth path to
prevent bypass risk.
[#13535](#13535)
- Sanitized HTML artifact rendering and iframe sandboxing to reduce
XSS-to-RCE risk. [#13529](#13529)
- Upgraded nodemailer to v8 to address SMTP command injection advisory.
[#13479](#13479)

### Bug Fixes

- Fixed image generation model default switch issues.
[#13587](#13587)
- Fixed subtopic re-fork message scope behavior and agent panel reset
edge cases. [#13606](#13606)
[#13556](#13556)
- Fixed chat-input freeze on paste and mention plugin behavior.
[#13551](#13551)
[#13415](#13415)
- Fixed auth/social sign-in and settings UX edge cases.
[#13368](#13368)
[#13392](#13392)
[#13338](#13338)

### Credits

Huge thanks to these contributors:

@chriszf @hardy-one @Innei @lijian @neko @OctopusNote @rdmclin2
@rivertwilight @RylanCai @suyua9 @sxjeru @Tsuki @wangyk @WindSpiritSR
@yizhuo @YuTengjing @hezhijie0327 @arvinxx
